
CS7DS2 Optimisation For Machine Learning Assignment: Mini-Batch SGD, Overfitting Analysis & Gradient Algorithms

Supplemental Assignment CS7DS2 Optimisation for Machine Learning

  • You must do this assignment entirely yourself – you must not discuss or collaborate on the assignment with other students in any way, you must write answers in your own words and write code entirely yourself. If you use any online or other external content in your report you should take care to cite the source. It is mandatory to complete the declaration that the work is entirely your own and you have not collaborated with anyone – the declaration form is available on Blackboard. All submissions will be checked for plagiarism.
  • Reports must be typed and submitted as a separate pdf on Blackboard (not as part of a zip file).
  • Include the source of code written for the assignment as an appendix in your submitted pdf report (the code itself, not a screenshot, so the plagiarism checker can run on it). Also include a separate zip file containing the executable code and any data files needed. Programs should be running code written in Python. Keep code brief and clean with meaningful variable names etc.
  • Important: Your primary aim is to articulate that you understand what you’re doing – not just running a program and quoting numbers it outputs. Generally most of the credit is given for the explanation/analysis as opposed to the code/numerical answer.
  • Reports should typically be about 5 pages, with 10 pages the upper limit (excluding appendix with code).

Assignment

  1. It is thought that the “noise” added to the gradient when using mini-batch SGD acts as a regulariser and so helps prevent overfitting. Your task is to write a short report critically evaluating the use of mini-batch SGD to reduce overfitting and improve generalisation performance.

    The choice of model and dataset is up to you, but you need to justify why your choice is appropriate – in particular, take care to ensure that overfitting actually occurs when using gradient descent (or SGD with a large batch size). It's also probably a good idea to look at two models/datasets so that you can compare and contrast them. You should investigate the role of mini-batch size in overfitting (remember that SGD becomes gradient descent when the batch size equals the full dataset), as well as the interplay between step size and mini-batch size. You should also investigate the use of a constant step size vs adaptive approaches such as Adam.

    Be sure to split the data into training, test and validation sets (cross-validation does not measure generalisation behaviour adequately). Also bear in mind the random nature of SGD, which means that you will probably need to do multiple runs and look at both the average behaviour and the fluctuations about the average from run to run.
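    As a starting point for the experiments described above, a minimal mini-batch SGD loop might look like the following (an illustrative sketch only, not a model answer; the names `grad_fn`, `minibatch_sgd` and the least-squares example are our own, and the `rng` seed argument is what lets you do the repeated runs mentioned above):

    ```python
    import numpy as np

    def minibatch_sgd(grad_fn, w0, X, y, batch_size=32, step_size=0.01,
                      n_epochs=10, rng=None):
        """Plain mini-batch SGD: each epoch shuffles the data, then steps
        through it in batches, moving w against the mini-batch gradient."""
        rng = np.random.default_rng(rng)
        w = np.asarray(w0, dtype=float).copy()
        n = len(X)
        for _ in range(n_epochs):
            idx = rng.permutation(n)  # fresh shuffle each epoch
            for start in range(0, n, batch_size):
                batch = idx[start:start + batch_size]
                w -= step_size * grad_fn(w, X[batch], y[batch])
        return w

    # Example gradient: mean squared error for least-squares regression.
    def lsq_grad(w, Xb, yb):
        return 2 * Xb.T @ (Xb @ w - yb) / len(Xb)
    ```

    Setting `batch_size` equal to `len(X)` recovers full-batch gradient descent, which is the comparison point the question asks for; varying the seed across runs gives the run-to-run fluctuations.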

    Remember that most of the marks are for critical analysis/discussion.

    85 marks: indicative breakdown (i) methodology 30 marks, (ii) evaluation and critical discussion 45 marks, (iii) report organisation and presentation 10 marks.

    1. Give short code for mini-batch SGD and explain its operation. [5 marks]
    2. Explain how a projected gradient descent can be used to enforce a constraint on the decision variables. Illustrate with a brief example. [5 marks]
    3. Briefly describe the RMSProp algorithm. Discuss how it differs from Adagrad (and why), and how it can change both the magnitude and direction of the step taken. [5 marks]
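    For sub-question 2, the kind of brief example asked for can be sketched as follows (illustrative only; the function names and the box-constrained quadratic are our own choices, not part of the assignment): after every gradient step, the iterate is mapped back onto the feasible set by a projection operator, which for a box constraint is simply coordinate-wise clipping.

    ```python
    import numpy as np

    def projected_gd(grad_fn, project, x0, step_size=0.1, n_iters=200):
        """Projected gradient descent: take an ordinary gradient step,
        then project the iterate back onto the feasible set."""
        x = np.asarray(x0, dtype=float).copy()
        for _ in range(n_iters):
            x = project(x - step_size * grad_fn(x))
        return x

    # Example: minimise f(x) = ||x - c||^2 subject to 0 <= x <= 1.
    # The Euclidean projection onto a box clips each coordinate.
    c = np.array([2.0, -0.5])
    grad = lambda x: 2 * (x - c)
    project_box = lambda x: np.clip(x, 0.0, 1.0)
    x_star = projected_gd(grad, project_box, np.zeros(2))
    ```

    Here the unconstrained minimiser `c` lies outside the box, so the method converges to the clipped point instead, which is exactly the constrained optimum for this separable objective.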
