Curious about all the hype around Machine Learning and Artificial Intelligence? Heard of "Neural Networks" and "Deep Learning" but confused about what it really means?

In this talk, you'll see what Artificial Neural Networks (ANN) look like and how they can "learn". And along the way, you'll discover how you can build your own ANN, with PHP of course!



Excellent talk, and very well presented. Nothing to complain about.

Sometimes the slide colours weren't really readable, but that's just nitpicking, I guess :)

A good talk, well presented, but there was a lot of information crammed in. Probably a failing of mine, but I had to skim over it a lot.

Mike Lehan at 16:43 on 13 Apr 2019

Lots of quality info, and with a lot to cover it was presented well. The slides had good info and the code walkthroughs were helpful. A few slides were presented without explanation (e.g. the tensor playground animation), and it would have been good for the bit on deep learning to be more than just "also, this is deep learning". Props for putting the example code on GitHub.

I loved your way of teaching! It's a complicated topic and you explained it with passion and energy. I understand a lot more about neural networks now, so that's cool!

The GIFs at the beginning of the talk made me feel a bit dizzy, and some slides could be difficult to read. So the one thing I'd advise you to work on next time is your visual presentation. Sometimes simple is better.

You delivered the talk with a very clear voice and expressive body language. You're a natural on stage! :D

Chris Emerson at 16:32 on 14 Apr 2019

I think this was a good introduction to the topic, but it perhaps tried to tackle too much, meaning much of the information was skipped over quite quickly. I'm reasonably familiar with how neural networks work already, but some concepts like gradient descent, sigmoid functions, etc. were mentioned without really explaining why these things are used or needed.

A very good talk. I might have a look to see what I can do.

As for negatives, I would say that you kept showing the formulas; however, it was not always obvious how these fit in (if at all). This may have

Hicham Abdel at 09:08 on 15 Apr 2019

Great talk, and good job building it from scratch in PHP ;)

Shaun Walker at 09:33 on 15 Apr 2019

As someone with only a high-level understanding of machine learning and neural networks, it was really interesting to see a more practical example that adds a bit of realism to the whole "okay, cool, but how does it actually work?" feeling you get after any machine learning/AI talk.

I agree with some others that it is maybe a little too much to cram into one talk, especially with the math. But I understand the idea was to show how (relatively) easy it is to get started with a neural network. I think this in-depth level is maybe better suited to a longer workshop, where people who really want to dig into the math can get more out of it.

For a talk, I personally think it could be improved (or maybe made into a different talk for a wider, non-math audience). Keep everything the same, but instead of sticking to a simple XOR example, take away the math part and replace the actual implementation details with general markers like "this is where you calculate the output" and "this is where you work out the error". That way you could use a more complex, real-world use case that helps bridge the gap between neural networks and real-world usage, and leave the nitty-gritty of the calculations to when someone tries to implement it themselves.

I still really enjoyed the talk and got a lot out of it, as someone with zero practical experience in this area and very little math knowledge.

Vitor is a really excellent speaker who brings enthusiasm and excitement in his presentation style.

Andrew Battye at 15:33 on 15 Apr 2019

I enjoyed the talk and found it very interesting. As others have said, it is a complicated topic for the uninitiated, but I thought the talk was pitched at the right level. My one suggestion would be to consider using a different example during the talk: the XOR model was trained on all possible inputs, so it was never able to demonstrate predictions on examples it had not seen before.