Our increasingly connected world, along with the promises of Big Data and Cloud Computing, offers us multitudes of opportunities to model the world and build powerful technology to improve not only the companies we work for but the lives of many. Yet far too often we, as engineers, focus on providing a solution to the problem at hand and overlook the problems that the solution itself might cause. In this talk, I examine some of the issues our world throws up, and ask what it takes to engineer ethically.



Liam Wiltshire at 17:57 on 10 Jun 2017

A talk designed to make you think, and it really did that. A fantastic way to round off the conference.

Matt Brunt at 18:11 on 10 Jun 2017

It covered some hard-hitting ideas and rather than trying to shy away from them, Chris brought them out into the open and really made you think about what you might do in a similar situation to the ones in the examples.

Chris' delivery was on-point. The timing, emphasis and delivery were as good as they come.

Daniel Shaw at 20:09 on 10 Jun 2017

I'm pretty sure I wasn't the only person in the audience getting drawn into this talk further and further, right up to the final sucker punch at the end. What a note to end the conference on.

Brilliant delivery. Brilliant content.

Katy Ereira at 20:28 on 10 Jun 2017

Chris is a very charismatic guy, and you always get really sucked into his talks. It was quite a sombre note to end a conference on, and it worked.

Tim Stamp at 21:10 on 10 Jun 2017

Thought provoking, and serves as a reminder of the potential our work has to do real world good, and real world harm.

Although I did think the fairly recent ethical discussions over programming of unavoidable-crashing autonomous vehicles would have had more of a mention instead of the heavy focus on one particular on-demand taxi-ish company...

Left me with some serious reflection time on the drive home.

Chris Emerson at 22:37 on 10 Jun 2017

Great way to finish the weekend - some really interesting things to think about here. We all like to think we'd do the right thing when faced with an ethical situation, but this talk shows it's sometimes not as straightforward as it seems!

A thought-provoking end to the conference delivered effectively by a natural speaker.

Chris Sherry at 12:30 on 11 Jun 2017

Chris is a great speaker and a great asset to the community!

Ethics is a subject that needs to be talked about more and I think Chris did a great job of presenting the issues.

Although I've never studied ethics, it has always been of interest to me - there wasn't anything new to me for most of the talk - but for those unfamiliar with ethics this was excellent. I can certainly see why the organisers put it in as a keynote.

I found Chris' reinvention of the trolley problem didn't help describe the problem better or bring it closer to home than the original - I'd be interested in helping come up with a more likely scenario with the same problem that a software engineer may find themselves in.

The sucker punch at the end was the question of whether or not we would take action in a difficult situation. As a suggestion, this question could be brought forward in the talk and potentially handed over to the audience for a debate rather than a simple show of hands.

Lee Boynton at 19:21 on 11 Jun 2017

Very thought provoking talk tackling a subject that I imagine most people would rather not think about. It was good to be reminded that as developers/engineers we all have a responsibility to think ethically and think about how technology can be used for bad, and what we can do about it.

If I was to make any suggestion, it would be that perhaps some of the analogies could have been thought of before the talk rather than on the spot. This may have made them a bit clearer.

It did go a bit dark at points, especially when the avocado on toast was put at risk, but this was the point of the talk.

Thanks Chris!

Gary Jones at 00:47 on 12 Jun 2017

Definitely thought-provoking. Chris delivered the content well, with the right level of humour for an otherwise dark thought exercise. For a talk like this, there are no black or white, right or wrong answers - that's the whole point. A clever end to the conference.

Naomi Gotts at 19:52 on 12 Jun 2017

I'm afraid I found this talk way too dark and not quite the mood I'd hoped to be walking away from the conference in. Yes, it was thought provoking but too much time spent talking about dark things. At the end the room felt too reflective and almost sombre. "That's it, you can all leave now". Yes, I can absolutely see the points that this talk was making and there's definitely a place for it; but just not sure it should have been the final note of the conference. The opening keynote on Friday by contrast was uplifting and inspiring, this one not so much.

A great ending to a fantastic conference!

Very well presented, and it made me think about the responses I gave when asked to put our hands up about things that have happened and could happen. I've never really thought about the impact of the code that I write, both personally and for work/other people.

We, as an industry, have created some of the most amazing software/devices to help improve lives in terms of needs and enjoyment, but never have I stopped to think about what companies get from our satisfaction of using them and what they do with that data.

Mark Dain at 22:02 on 12 Jun 2017

The majority of the content was good but I have some real concerns over the message it gives towards the end.

The first part of the talk covered ethical frameworks like absolute, relative, individualistic, and so on - this was fine; it's theoretical material taught in schools.

The next part covered applying these frameworks to a classic ethical scenario: a runaway train that can go down track A or track B. One will kill a family of 3 and one will crash the train. This was great because it showed how none of the frameworks is neatly applicable and *each problem must be tackled on a case-by-case basis*.

Where the talk started to turn bad was towards the end, when Christopher started trying to explore how technology could be used to solve social problems, such as putting microchips in guns.

Technical solutions to social problems don't work because they aren't technology problems; they're social problems that require social solutions like improved handling of mental health, teaching conflict resolution skills in schools, and educating people on the value of life.

I really think this example has to be worked on. My concern with smart guns is what happens when they're hacked? If something can be hacked, it will be hacked as we've seen with entertainment systems in cars. What if we could disarm police officers? What if the army also had smart guns and an invading force had regular guns?

If that example was trying to highlight the growing issues with the recent IoT-all-the-things trend, then I absolutely wish that had been made much clearer.

Wim Godden at 23:25 on 13 Jun 2017

Might need some polishing with certain examples, but the way it ended was perfect. It gave every open minded person something to think about.

Neil Nand at 23:07 on 15 Jun 2017

A good entertaining talk although I did think it lost focus a bit on the ethics of engineering part way through.

James Titcumb at 10:57 on 16 Jun 2017

I was really looking forward to this. It was thought-provoking (maybe a bit dark heh), but that was a great end to the conference IMO. Chris is an excellent speaker, and this was a very well delivered and timed talk. Top notch stuff.