Hitching Onto The AI Wagon

Submitted by Xilodyne on Sun, 07/23/2017 - 17:41
neuromorphic chip

If you can look past the snide English tone (and be thankful you never had to work with this person), this is a fairly interesting article on the different methods tried over the decades to get thinking systems to do something productive: https://www.theregister.co.uk/2017/07/21/artificial_intelligent/

But I think he misses where exactly we are in the current technology cycle, especially with regard to machine learning. Hitching your wagon to a technology wave takes a bit of luck: you have to figure out what to focus on to get access to employment and a decent salary. I've managed to catch two waves driven by Internet growth, the first being Java HTML/server development in the late 1990s and then cyber security in the mid-2000s. The pace of innovation in machine learning will continue, and its effect on AI is the same type of opportunity. It certainly feels like the excitement that existed before the DotBomb, the Internet bubble burst of 2001 (and there hasn't been any mention of a bubble just yet).

Right now the difficulty is that machine learning / AI positions are few and far between compared to other types of development positions. Most of the related work seems to be in data science, usually requiring a graduate-level degree, rather than in coding solutions.

Employment opportunities will increase substantially as neuromorphic chips, designed to mimic the brain, improve (like here, or here), and as neural-network-specific chips arrive (I'm looking forward to trying out the recent Movidius Neural Compute Stick, a USB device). These new chips will appear in cell phones, in cars, everywhere there are IoT devices, not to mention in data centers and SMEs.
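
For a sense of what coding against one of these devices looks like, here is a minimal inference sketch, assuming the NCSDK's mvnc Python API and a network already compiled to the stick's graph format with the mvNCCompile tool (the 'graph' file name and the input shape are placeholders, not from any particular model):

```python
# Minimal sketch: run one inference on a Movidius Neural Compute Stick
# via the NCSDK mvnc Python API. Assumes a pre-compiled 'graph' file.
import numpy
from mvnc import mvncapi as mvnc

# Find and open the first attached stick.
devices = mvnc.EnumerateDevices()
if not devices:
    raise RuntimeError('No Neural Compute Stick found')
device = mvnc.Device(devices[0])
device.OpenDevice()

# Load the pre-compiled network onto the stick.
with open('graph', 'rb') as f:
    graph = device.AllocateGraph(f.read())

# Push one input tensor (the stick works in float16) and read back the result.
image = numpy.random.rand(224, 224, 3).astype(numpy.float16)  # stand-in input
graph.LoadTensor(image, 'example')
output, _ = graph.GetResult()
print(output.argmax())

# Clean up.
graph.DeallocateGraph()
device.CloseDevice()
```

The point of the sketch is how little ceremony is involved: the heavy lifting happens at compile time, and the runtime side is just load-tensor / get-result, which is exactly the kind of integration work these chips will generate.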

As in earlier tech waves, there will be tremendous work to be done in all areas of the economy integrating these chips into business processes to maintain a company's competitive advantage. When the wave starts to crest, some money can be made. (We're seeing this already with companies devoted solely to creating learning systems.) Of course, this assumes that the chips do not become so smart that developers are no longer needed for solutions and integration (I haven't seen it yet, but where there is a problem, solutions often follow), and that the current effort by large companies to have AI write software stays on the fringes of actual development (Microsoft's DeepCoder is a beginning).