These days engineers can't escape the notion that the Industrial Internet of Things (IIoT) and Big Data are going to revolutionize automation, so I thought it would be a good idea to learn about Hadoop using WL. In 2013 Wolfram announced "Mathematica Gets Big Data with HadoopLink", and it seemed instructive to try it out in a notebook.
https://blog.wolfram.com/2013/07/31/mathematica-gets-bigdata-with-hadooplink/
To my disappointment, it seems Needs["HadoopLink`"] died on the vine. Can someone from Wolfram explain why?
If Wolfram is serious about WL being a credible platform for Data Analytics, it will need to provide us with a way of accessing and computing with Big Data. Perhaps the strategy is to get Data Analytics working with Little Data first? And perhaps after that a free developer kernel could be run on the slave nodes? Whatever the game plan, it would be good to know where we're heading.