After searching for a few hours, I can't find any examples of text-based Neural Networks. Image processing is great and all but NLP is key to what I am building.
This is not an accident: v11.0 is not very good for text. Text support was reserved for 11.1, which will:
1) have full RNN support
2) support variable-length sequences
3) have appropriate NetEncoders for text
4) generalize existing layers (add 1-d convolutions, make EmbeddingLayer accept sequences)
Regarding your example: without NetEncoders for text, it's up to you to convert the text into some appropriate tensor representation that can be fed into a conv net. The framework obviously has no idea what to do with raw text.
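For instance, one common ad-hoc representation is a character-level one-hot encoding: map each character to an index in a fixed alphabet, then to a unit vector. A minimal sketch in Wolfram Language (the alphabet choice and padding strategy are assumptions, not anything built into v11.0):

```mathematica
(* Sketch: character-level one-hot encoding of a string.
   Assumes a fixed lowercase alphabet; characters outside it are dropped. *)
text = "hello";
alphabet = CharacterRange["a", "z"];

(* position of each character in the alphabet, e.g. "h" -> 8 *)
indices = Flatten[Position[alphabet, #] & /@ Characters[ToLowerCase[text]]];

(* one unit vector per character: a Length[text] x 26 tensor *)
tensor = UnitVector[Length[alphabet], #] & /@ indices;

Dimensions[tensor]  (* {5, 26} *)
```

Since v11.0 has no variable-length sequence support, you would also need to pad or truncate every string to a fixed length before feeding such tensors into a net.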