Begin by reading the Guardian article “How can we stop algorithms telling lies?”.
In case that link disappears, a local copy is also available as a PDF, but go to the original if possible.
Submit a report, with the paper number, your name, and your student ID on the front page, as a PDF file. The method is explained below.
There are four (imaginary) people. For each of them, say in about half a page what they should do according to the IITP Codes. Each analysis is worth 1% of your mark for this course, a total of 4%. You must give the section numbers in the IITP Codes that justify your answers; failing to do so in any answer will cost you half the marks for that question. If you think the IITP is wrong, state your own position, but cite the IITP as well.
There are no set answers that you must match. The task is to apply the Codes, not to come up with the proper decisions.
She has just been approached by a transport company that would like her to see how well a Deep Learning algorithm can predict driver and customer behaviour.
His current task is to improve the algorithm that decides whether, and under what conditions, a customer should be offered a loan.
Her responsibility is the software for their range of internet-connected toys. One feature of these toys is speech recognition: if a child says “I hate broccoli”, the toy might reply “What about cauliflower?”. The toys are always on. She is currently in talks with the Police about using them to watch out for signs of child abuse and domestic violence, which we all agree are bad things. But one of the Police officials accidentally let slip that they would like to pick up mentions of drug names too.
To submit: put your report in a directory containing no other PDFs and then enter the command
ethics-submit
in a terminal inside that directory. You will be asked to confirm that the correct file is being submitted and then the message “Submission complete” will be printed.
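Before running the command, it is worth checking that the directory really does contain exactly one PDF. A minimal sketch with standard tools (the directory name is hypothetical; `ethics-submit` itself is the course tool described above):

```shell
cd ~/ethics-report        # hypothetical directory holding only your report
# Count the PDFs here; this should print 1 before you run ethics-submit
ls -1 -- *.pdf 2>/dev/null | wc -l
```

If the count is anything other than 1, move the extra PDFs elsewhere first, so the correct file is the one you confirm.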