Talk comments

Neil Nand at 22:52 on 15 Jun 2017

Really good talk on some of the optimisations in PHP internals, but I went in with the impression that there'd be more information on how PHP developers could optimise the code they write. Maybe just make that a little clearer in the talk description?

Neil Nand at 22:50 on 15 Jun 2017

Certainly a good way to give a talk on functional programming and introduce the Phunkie framework, but I have very limited knowledge of functional programming, so I could only follow so far.

Neil Nand at 22:48 on 15 Jun 2017

Good history lesson on computing; I didn't know a lot of it, and it led nicely on to how we'll be shaping its future.

We definitely need more attention to readability and optimization for debugging; tests are useful when they fail. To be complete, it needs to go into matchers vs. custom assertions, as in http://xunitpatterns.com/Custom%20Assertion.html, and where there are advantages to using one or the other.

The first time in my life I felt I understood monads.

Wim Godden at 23:25 on 13 Jun 2017

Might need some polishing with certain examples, but the way it ended was perfect. It gave every open-minded person something to think about.

Good talk. Despite using regex for a while, I still picked up a few useful tips.

Great talk. Entertaining and useful.

Mark Dain at 22:02 on 12 Jun 2017

The majority of the content was good but I have some real concerns over the message it gives towards the end.

The first part of the talk covered ethical frameworks like absolute, relative, individualistic, etc. This was fine; it's theoretical stuff taught in schools.

The next part covered applying these frameworks on a classic ethical scenario; a runaway train that can go down track A or track B. One will kill a family of 3 and one will crash the train. This was great because it showed how neither framework is neatly applicable and *each problem must be tackled on a case-by-case basis*.

Where the talk started to turn bad was towards the end, when Christopher started trying to explore how technology could be used to solve social problems, such as putting microchips in guns.

Technical solutions to social problems don't work because they aren't technology problems; they're social problems that require social solutions like improved handling of mental health, teaching conflict resolution skills in schools, and educating people on the value of life.

I really think this example has to be worked on. My concern with smart guns is: what happens when they're hacked? If something can be hacked, it will be hacked, as we've seen with entertainment systems in cars. What if we could disarm police officers? What if the army also had smart guns and an invading force had regular guns?

If that example was trying to highlight the growing issues with the recent IoT-all-the-things trend, then I really wish that had been made much clearer.

A great ending to a fantastic conference!

Very well presented, and it made me think about the responses I gave when asked to put our hands up about things that have happened and could happen. I've never really thought about the impact of the code that I write, both personally and for work/other people.

We, as an industry, have created some of the most amazing software/devices to help improve lives in terms of needs and enjoyment, but never have I stopped to think about what companies get from our satisfaction in using them and what they do with that data.