
THE UNIVERSITY OF HUDDERSFIELD
School of Computing and Engineering
ASSIGNMENT SPECIFICATION
Module Code
CHS2546
Module Title
Distributed and Client Server Systems
Course Title/s
MEng/BSc Software Engineering, MSci/BSc Computer Science, MComp/BSc Computing, BSc Computer Science with…
Assignment Details
Title
Assignment 2 – a Corba Air Quality Monitoring System
Weighting
50 %
Mode of working for assessment task.
☐ Individual
There should be no collusion or collaboration whilst working on and subsequently submitting this assignment.
Module Leader
Dr Gary Allen
Contact details: g.allen@hud.ac.uk, ext 2152
Module Tutor/s

Hand-out date
Term 2 Week 4, w/c Monday 10/02/20
How to submit your work
Via Brightspace (reports to be automatically checked by TurnItIn)
Submission date/s and times
12.00 lunchtime on Friday 01/05/20
Expected amount of independent time you should allocate to complete this assessment
20 hours, including some time spent in practicals supporting the assignment.
Submission type and format
Reports should preferably be in PDF format. You will also be asked to submit your code. More information on this will be provided closer to the hand-in date.

Date by which your grade and feedback will be returned
End of Year.

Your responsibility
It is your responsibility to read and understand the University regulations regarding assessment. http://www.hud.ac.uk/registry/regulationsandpolicies/studentregs.
Please pay special attention to the assessment regulations (section 4) on Academic Misconduct
In brief: ensure that you:
1. DO NOT use the work of another student – this includes students from previous years and other institutions, as well as current students on the module.
2. DO NOT make your work available, or leave it insecure, for other students to view or use.
3. Any examples provided by the module tutor should be appropriately referenced, as should examples from external sources.
Further guidance can be found in the SCEN Academic Skills Resource and UoH Academic Integrity Resource module in Brightspace.
If you experience difficulties with this assessment or with time management, please speak to the module tutor/s, your Personal Academic Tutor, or the School’s Guidance Team. (sce.guidance@hud.ac.uk).
Requesting an Extension
You are reminded to back up your work, as extensions will not be given for lost work, which includes work lost due to hardware or software failures.
Extension requests will only be approved if you can demonstrate genuine, unexpected circumstances, along with independent supporting evidence (e.g. a medical certificate), that may prevent you from submitting an assessment on time.
Submit an extension request via Student Portal within 2 working days of the due date.
Extension requests, up to a maximum of 10 working days but typically 1–5 working days, will be considered provided that there is appropriate evidence which clearly indicates the reasons for the request.
You will have 5 working days after submitting a request to provide the evidence. Failure to submit evidence will result in the request being rejected and your work being marked as a late submission.
If you are unable to submit work within the maximum extension period of 10 days, contact the School’s Guidance team (sce.guidance@hud.ac.uk), as you may need to submit a claim for Extenuating Circumstances (ECs).
Extenuating Circumstances (ECs)
An EC claim is appropriate in exceptional circumstances, when an extension is not sufficient due to the nature of the request, or it concerns an examination or In-Class Test (ICT).
You can access the EC claim form via MyHud or the Registry website: https://www.hud.ac.uk/registry/extenuatingcircumstancesfaqs where you can also find out more about the process.
You will need to submit independent, verifiable evidence for your claim to be considered.
Once your EC claim has been reviewed you will get an EC outcome email from Registry. If you are unsure what it means or what you need to do next, please speak to the Student Support Office – SJ1/01.
An approved EC will extend the submission date to the next assessment period (e.g. the July resit period).
Late Submission (No ECs approved)
Late submission, up to 5 working days after the assessment submission deadline, will result in your grade being capped at a maximum of a pass mark.
Submission after this period, without an approved extension, will result in a 0% grade for this assessment component.
Tutor Referral available
☐ Yes ☐ No
Resources
Please note: you can access free office software, and you have 1 TB of free storage space available on Microsoft’s OneDrive system.
https://students.hud.ac.uk/it/unimail/office365/
Access to labs running IntelliJ (also available for use at home)


Corba Air Quality Monitoring System

1. Assignment Aims
To enable students to demonstrate skills in client-server systems development using an appropriate middleware platform (Corba being the platform taught, but students may negotiate with the module leader to use other platforms with which they are familiar if they wish to do so).
2. Learning Outcomes:
This coursework assesses the following learning outcomes:
1.1 Be able to describe the structure of distributed systems.
1.2 Assess developments in the area and understand their impact on the software development process
1.3 Be able to describe the design issues relating to the development of distributed information systems.
1.4 Be familiar with the range of products and frameworks available to aid in the development of distributed software.
2.1 Analyse a problem, identifying client applications, services and distributed structures to satisfy the requirements.
2.2 Design, implement and test the components which have been derived.
2.3 Use proprietary solution frameworks and services.
3. Assessment Brief
Background
Environmental monitoring has become increasingly important over recent years. In particular, the UK’s failure to meet EU Environmental Standards for air quality in city centres, caused mainly by air pollution from petrol- and diesel-engined vehicles, has led to an increased focus on how to monitor the level of pollutants in the atmosphere, and how to warn the public of areas of particular concern. A new company, Environmental Monitor, is looking to develop a low-cost system to monitor air pollution throughout city centres. Starting initially at busy junctions and known traffic black spots, the company plans to roll out monitoring equipment throughout the UK, collecting real-time readings of a range of pollutants, starting with nitrogen oxides (NOx) levels. The company would like to have the ability both to retrieve data from monitoring stations (i.e. to use a ‘pull’ technology) on a regular basis (typically every hour, but this is subject to further discussion), and for the monitoring stations to contain sufficient on-board (edge) processing for them to be able to raise alarms (i.e. use a ‘push’ technology) should extreme or unexpected readings arise. In order to reduce the possibility of false alarms, two monitoring stations in the same geographical area must raise an alarm before this is passed on to the Environmental Centre for analysis.
The company is looking to develop a prototype distributed Client-Server implementation, based around the CORBA architecture. It is envisaged that a monitoring station will eventually contain sensors to measure various pollutants and indicators of air quality, but initially the company will focus on nitrogen oxides as a basic proof of concept. The company has decided on an architecture for the system that comprises: the Monitoring Station, which can be installed wherever required and communicate with the rest of the system over wireless 4G networks; a Local Server, to which all monitoring stations in a particular geographical area will be connected; and a central Monitoring Centre, to which the local servers will report. The idea behind this architecture is both to distribute the processing and communications within the system, and to provide resilience should, for example, the central monitoring centre become unavailable for some period of time. The company has asked you to prototype the system, and to make recommendations as to the suitability of the architecture and the platform selected.
Requirements
You have been asked to develop a CORBA-based client-server system. The requirements can be broken down into a number of separate subsystems, which will need to communicate in a client-server manner to satisfy the overall requirements.
1. The Monitoring Station
The Monitoring Station is a stand-alone monitoring system, to be prototyped as a CORBA server, that supports the following functionality at least:
• Register itself with a Regional Centre upon initial activation
• Can be remotely activated
• Can be remotely deactivated
• Can be remotely reset
• Can return, upon request, the current value of the nitrogen oxides sensor
• Can identify anomalous or potentially dangerous readings of nitrogen oxides and alert the Local Server immediately.
Note that, for this prototype implementation, there is no expectation that real sensors be used. Instead a (simple) GUI is envisaged as representing the state of the sensor.
A preliminary IDL starting specification might be:

struct NoxReading {
    long time;
    long date;
    string station_name;
    long reading_value;
};

interface MonitoringStation {
    readonly attribute string station_name;
    readonly attribute string location;
    NoxReading get_reading();
    void activate();
    void deactivate();
    void reset();
};
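As a hint of one way the “identify anomalous or potentially dangerous readings” requirement could be simulated without real hardware, the sketch below combines an absolute threshold with a simple spike test against a rolling average. It is a minimal illustration only: the threshold values, the window size, and all class and method names are assumptions, not part of the specification.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch only: the thresholds, window size, and names below
// are assumptions, not part of the specification.
public class NoxSensorSimulator {
    private static final long DANGER_THRESHOLD = 100; // absolute danger level
    private static final double SPIKE_FACTOR = 2.0;   // jump relative to recent average
    private static final int WINDOW = 5;              // readings kept for the average

    private final Deque<Long> recent = new ArrayDeque<>();

    // Record a reading and report whether it should trigger an immediate
    // alert to the Local Server.
    public boolean record(long value) {
        boolean dangerous = value > DANGER_THRESHOLD;
        double average = recent.stream().mapToLong(Long::longValue).average().orElse(0);
        boolean spike = !recent.isEmpty() && value > SPIKE_FACTOR * average;
        recent.addLast(value);
        if (recent.size() > WINDOW) {
            recent.removeFirst();
        }
        return dangerous || spike;
    }
}
```

In a prototype, a method like `record` would be called whenever the simulated sensor (e.g. the GUI control) changes, and a `true` result would prompt the station to alert its Local Server.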
2. The Local Server
The Local Server is to be prototyped as a CORBA server that supports the following functionality at least:
• Receives requests to register Monitoring Stations and maintains a list of connected devices
• Receives alerts from connected Monitoring Stations, and maintains a log of these alerts
• Triggers an alarm at the Environmental Centre when two alarms happen within a specified time frame
• Returns the log upon request
• Polls all connected Monitoring Stations when requested to do so, and returns a set of readings
A preliminary IDL starting specification might be:

Typedef sequence Log_of_alarm_readings;
Typedef sequence Set_of_readings;

interface RegionalCentre {
    readonly attribute string name;
    readonly attribute location_name;
    readonly attribute Log_of_alarm_readings log;
    void raise_alarm(in NoxReading alarmReading);
    Set_of_readings take_readings();
    void add_monitoring_station(in string station_name, in string station_location, in string station_ior);
};
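The “two alarms within a specified time frame” rule is the heart of the Local Server’s false-alarm filtering. One possible sketch of that correlation logic is shown below; the window length, and the decision to key alarms by station name, are assumptions made purely for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: the window length and all names are assumptions.
public class AlarmCorrelator {
    private final long windowMillis;
    // Most recent alarm time per station within this Local Server's area.
    private final Map<String, Long> lastAlarm = new HashMap<>();

    public AlarmCorrelator(long windowMillis) {
        this.windowMillis = windowMillis;
    }

    // Record an alarm; return true when a second, different station has
    // alarmed within the window, i.e. the alarm is confirmed and should be
    // escalated to the Monitoring Centre.
    public boolean raiseAlarm(String stationName, long timestampMillis) {
        boolean confirmed = lastAlarm.entrySet().stream()
                .anyMatch(e -> !e.getKey().equals(stationName)
                        && timestampMillis - e.getValue() <= windowMillis);
        lastAlarm.put(stationName, timestampMillis);
        return confirmed;
    }
}
```

Note that, per the background section, the two alarms must come from different Monitoring Stations in the same area, which is why a repeat alarm from the same station does not confirm.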
3. The Monitoring Centre
The Monitoring Centre is to be prototyped as a CORBA server that supports the following functionality at least:
• Receives confirmed alarms from Local Servers
• Alerts the operator when a confirmed alarm is received
• Allows agencies (e.g. the Environment Agency, local councils, local pressure groups, etc.) to register for notifications in particular areas in case of alarms
• Maintains a list of connected Local Servers
• Polls all Local Servers upon request and displays the results of readings returned, highlighting readings of concern
A preliminary IDL starting specification might be:

interface MonitoringCentre {
    void raise_alarm(in NoxReading alarm_reading);
    void register_agency(in string who, in string contact_details, in string area_of_interest);
    void register_local_server(in string server_name)
};
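To illustrate how the Monitoring Centre might route confirmed alarms only to agencies that registered an interest in the affected area, here is a minimal sketch of an agency register. All names are illustrative assumptions; a fuller design would also store contact details and deliver notifications, for example via CORBA callbacks.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only: class and method names are assumptions.
public class AgencyRegister {
    // Maps an area of interest to the agencies registered for it.
    private final Map<String, List<String>> byArea = new HashMap<>();

    public void registerAgency(String who, String areaOfInterest) {
        byArea.computeIfAbsent(areaOfInterest, k -> new ArrayList<>()).add(who);
    }

    // Returns the agencies that should be notified for an alarm in the area.
    public List<String> agenciesFor(String area) {
        return byArea.getOrDefault(area, List.of());
    }
}
```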
DO NOTE that all of the above IDL has been given as a starting point. It could be incomplete, it could be better structured, it definitely contains a few errors, and it does require further development.
To make these tasks easier, you may make the following assumptions:
1. Only one alarm can be serviced at a time.
2. Two Local Servers is a reasonable starting point for the purpose of testing the full functionality of your simulation.
3. Two Monitoring Stations connected to each Local Server is a minimally sensible start, but for full testing more will almost certainly be needed, and support for locations, or named areas (with one or more local servers in each location) should be considered.
4. Any user interfaces need not be very sophisticated (this is NOT a GUI intensive application – there are NO MARKS for a flash GUI with no functionality!). Remember that in reality the monitoring stations and local servers would be deployed as devices with no user interface at all.
5. Any ambiguities may be resolved to your own satisfaction as long as they are clearly justified.
The focus of this assignment is to encourage you to construct a software model of the Environmental Monitoring scenario in which functionality is achieved by using the services provided by different classes of software objects.
Hints and Tips
The following may help you to undertake this assignment:

1. Understanding is essential.
You cannot build a system until you understand what is required. Take time to draw a map of all of the different parts of the system, identify which are the clients and which are the servers, identify the information flow from class to class. Allocate functions to each class as required. Once you have a clear idea of what you are trying to do you can then start to implement it.
2. The incremental approach to software construction.
An advantage of the object-oriented approach is that you do not need to write a full working system at the outset. Development can be on a step-by-step basis. Utilise functions that do nothing. As you add each function to the class definition, you also add the corresponding code to your client program to use the function and make sure it works sensibly.
You are strongly advised to take the incremental approach – so that you achieve the bare bones of a working system at a relatively early stage.
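For example, a first increment of the Monitoring Station servant might stub out almost everything, so that a client can be compiled and exercised against it straight away. The class name and the placeholder reading value below are assumptions; in a real build this class would extend the POA skeleton generated from the IDL.

```java
// Illustrative first increment: most operations are stubs that do little or
// nothing yet, so a client can be built and tested against them immediately.
public class MonitoringStationImpl {
    private boolean active = false;

    public void activate()   { active = true; }
    public void deactivate() { active = false; }
    public void reset()      { /* stub: nothing to reset yet */ }

    // Stub reading: a fixed placeholder value until the GUI sensor is added.
    public long getReadingValue() {
        return active ? 42 : 0;
    }
}
```

Each later increment then replaces one stub at a time, re-testing the client after every change.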
Hand In
You are required to hand in a report containing the following components:

1. System Design (to include as a minimum a use case diagram, a class diagram, and at least 2 relevant and non-trivial sequence diagrams, along with approximately half a page of supporting discussion and justification of the design decisions made).
2. All final IDL specifications.
3. All source code (you need NOT include the code auto-generated from the IDL).
4. Evidence of system behaviour and a detailed evaluation of the extent to which you have been able to implement the features required. This analysis should highlight not only your successes but also any outstanding incomplete, incorrect, or awkward features.
5. A critique of the work undertaken, which MUST include a critical evaluation of the chosen architecture, and a discussion of at least one alternative solution to the given problem, identifying the advantages and disadvantages of each. For example, you could compare your solution to another, alternative CORBA architecture, or you could compare your CORBA solution to a Web Services solution. You should seek to highlight advantages and disadvantages of each approach. This report should be approximately one to two sides of A4.
It is very difficult to give a word count for a programming assignment, but I would expect the design to run to about 4–5 sides, including diagrams, and the testing should also be around 4–5 sides. If the code meets the minimum specification (above) then it can gain an ‘A’ grade. To get an A+ you need to add some extra features, ensure the system is highly robust, and generally make a high-quality product.
The report will be submitted to the TurnItIn plagiarism checking service via the module’s Brightspace page. Further details of how to submit the work will be provided closer to the due date. You must also hand in all code developed, along with instructions for running the code. Full details of how to submit the code will be provided close to the hand-in date.
You might also be asked to demonstrate your system, in order to allow the functionality and reliability of the software to be assessed, but note that the presentation itself will not contribute to the grade.
4. Marking Scheme
Your work will be marked on the quality of the design of the software (30% of the marks, assessed from the quality of the documentation produced and from examination of the code), the quality of the code handed in (including completeness, correctness, layout, comments, selection of variable names, etc., and worth 30% of the marks), the quality of the testing carried out (20% of the marks), and the quality of the evaluation/critique produced (20% of the marks).

5. Grading Rubric
The list shown below gives a rough idea of the grade to be awarded to the differing standards of work which might be submitted.
A Grade: A well-presented piece of work, well-structured program code implementing an appropriate design, thorough testing with well-chosen test data, and a thoughtful evaluation/critique of what has been achieved.
B Grade: Similar to an A grade but with deficiencies in one or two areas.
C Grade: A competent piece of work with a good (but not ideal) design, code that mostly works but is incomplete or contains a few bugs, a reasonable attempt at testing, and a reasonable evaluation/critique.
D Grade: A poor but mostly usable design, code which compiles but does not work fully, poor or incomplete testing, and an evaluation/critique which lacks depth.
E/F Grade: Poor design, faulty code, little testing.