COMP9417 Project: TracHack Challenge 22.2 – Predicting Eligibility for the Emergency Broadband Benefit Program
March 28, 2022
Project Description
All project-specific information may be found here.
Overview of Guidelines

• The deadline to submit the report is 5pm April 20. The deadline to submit your predictions is 5pm April 17 (Sydney time) for the Internal Challenge project (ML vs. Cancer) and 11:59pm US Eastern Time April 17 for the TracHack project.
• Submission will be via the Moodle page
• You must complete this work in a group of 3-5, and this group must be declared on Moodle under Group Project Member Selection
• The project will contribute 30% of your final grade for the course.
• Recall the guidance regarding plagiarism in the course introduction: this applies to all aspects of this project as well, and if evidence of plagiarism is detected it may result in penalties ranging from loss of marks to suspension.
• Late submissions will incur a penalty of 5% per day from the maximum achievable grade. For example, if you achieve a grade of 80/100 but you submitted 3 days late, then your final grade will be 80 − 3 × 5 = 65. Submissions that are more than 5 days late will receive a mark of zero. The late penalty applies to all group members.
Objectives
In this project, your group will use what you have learned in COMP9417 to construct a classifier for the specified task, as well as write a detailed report outlining your exploration of the data and your approach to modelling. The report is expected to be 10-12 pages (single column, 1.5 line spacing) and easy to read. The body of the report should contain the main parts of the presentation; any supplementary material should be deferred to the appendix. For example, only include a plot if it is important for getting your message across. The report is to be read by the client, and the client cares about the big picture, pretty plots and intuition. The guidelines for the report are as follows:
1. Title Page: title of the project, name of the group, and all group members (names and zIDs).
2. Introduction: a brief summary of the task, the main issues for the task and a short description of how you approached these issues.
3. Exploratory Data Analysis: this is a crucial aspect of this project and should be done carefully given the lack of domain information. Some (potential) questions for consideration: are all features relevant? How can we represent the data graphically in a way that is informative? What is the distribution of the classes? What are the relationships between the features? A minimal example of such checks is sketched after this list.
4. Methodology: A detailed explanation and justification of methods developed, method selection, feature selection, hyper-parameter tuning, evaluation metrics, design choices, etc. State which method has been selected for the final test and its hyper-parameters.

5. Results: include the results achieved by the different models implemented in your work, with a focus on the F1 score. Be sure to explain how each of the models was trained and how you chose your final model.
6. Discussion: Compare different models, their features and their performance. What insights have you gained?
7. Conclusion: Give a brief summary of the project and your findings, and what could be improved on if you had more time.
8. References: a list of all literature that you have used in your project, if any. You are encouraged to go beyond the scope of the course content for this project.
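As a minimal sketch of the exploratory checks mentioned in item 3, assuming the training data arrives as a CSV file (the file name train.csv and the label column name eligible below are hypothetical placeholders, not names taken from the actual data), one might start with the class distribution, missing values and feature correlations:

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and label-column names; substitute the real ones.
df = pd.read_csv("train.csv")

# Class distribution: how imbalanced is the target?
print(df["eligible"].value_counts(normalize=True))

# Fraction of missing values per feature: are all features usable as-is?
print(df.isna().mean().sort_values(ascending=False).head(10))

# Pairwise correlations among numeric features, shown as a heatmap.
corr = df.select_dtypes("number").corr()
plt.imshow(corr, cmap="coolwarm", vmin=-1, vmax=1)
plt.colorbar(label="Pearson correlation")
plt.title("Feature correlation matrix")
plt.show()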
Project implementation
Each group must implement a minimum of two classification methods and select the best classifier, which will be used to generate predictions for the test sets of the respective task. You are free to select the features and tune the methods for best performance as you see fit, but your approach must be outlined in detail in the report. You may also make use of any machine learning algorithm, even if it has not been covered in the course, as long as you provide an explanation of the algorithm in the report and justify why it is appropriate for the task. You can use any open-source libraries for the project, as long as they are cited in your work. You can use all the provided features or a subset of features; however, you are expected to give a justification for your choice. You may run some exploratory analysis or some feature selection techniques to select your features. There is no restriction on how you choose your features as long as you are able to justify it. In your justification of selected methods, parameters and features, you may refer to published results of similar experiments. A minimal sketch of this compare-and-select workflow is given below.
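The following is a minimal sketch of such a workflow, not a prescribed implementation: it compares two illustrative classifiers by cross-validated F1 and tunes one of them with a grid search. The synthetic data from make_classification stands in for the real, already-prepared feature matrix and labels, and the parameter grid is an arbitrary example.

# Sketch only: synthetic data stands in for the prepared project features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Baseline comparison: 5-fold cross-validated F1 for each candidate model.
models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="f1")
    print(f"{name}: mean F1 = {scores.mean():.3f}")

# Tune the better-performing model; this grid is illustrative only.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    scoring="f1",
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)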
Code submission
Code files should be submitted as a separate .zip file along with the report, which must be in .pdf format. Penalties will apply if you do not submit a pdf file (do not put the pdf file in the zip).
Peer review
Individual contribution to the project will be assessed through a peer-review process, which will be announced after the reports are submitted. This will be used to scale marks based on contribution. Anyone who does not complete the peer review by 5pm on the Thursday of Week 11 (29 April) will be deemed not to have contributed to the assignment. Peer review is a confidential process and group members are not allowed to disclose their reviews to their peers.
Project help
Consult the online documentation of Python packages for using methods, metrics and scores. There are many other resources on the Internet and in the literature related to classification. When using these resources, please keep in mind the guidance regarding plagiarism in the course introduction. General questions regarding the group project should be posted in the Group project forum on the course Moodle page. Note: you have now been added to the TracHack Teams group; you can post TracHack-specific questions there instead of the forum to get direct help from the TracHack team.