Nice project! I wanted to run your code, but there are many missing definitions (e.g., convolutionBlock, ArrayLayer, vocabulary, some .wlnet files, etc.). Do you think you can add them so it is runnable?
The full notebook can be found here: https://notebookarchive.org/2020-07-6hfo8lo
-- You have earned the Featured Contributor Badge. Your exceptional post has been selected for our editorial column Staff Picks (http://wolfr.am/StaffPicks), and your profile is now distinguished by a Featured Contributor Badge and is displayed on the Featured Contributor Board. Thank you!
Hi Mike,
Thank you for your interest in my project. I have uploaded the .wlnet files here: https://github.com/sumansid/Text-Generation-using-GANs and will also add the full notebook. Please let me know if you have any questions.
Impressive! Really well written. I like the GIF!
GANs are very unstable during training because the two networks depend on each other for improvements in performance. Although several methods to improve stability were implemented in the GAN architecture, it still suffers from non-convergence. Because of this instability, early stopping fails to retrieve the best-performing GAN model. To solve this, we save a checkpoint of the model every round along with an n-gram performance metric, and use that metric to select the best-performing model.
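For anyone wanting to see what that selection strategy looks like in practice, here is a minimal Wolfram Language sketch. The trainOneRound function and nGramScore metric are hypothetical stand-ins for the project's actual training step and evaluation code; only the per-round .wlnet checkpointing and the metric-based selection mirror the strategy described above.

    (* Hypothetical sketch: trainOneRound and nGramScore stand in for the
       project's actual training step and n-gram metric; assumes generator
       and discriminator are already-initialized nets. *)
    rounds = 50;
    log = Table[
       generator = trainOneRound[generator, discriminator];
       file = "generator-round-" <> IntegerString[round, 10, 3] <> ".wlnet";
       Export[file, generator];  (* checkpoint the model every round *)
       <|"File" -> file, "Score" -> nGramScore[generator]|>,
       {round, rounds}];
    best = First@MaximalBy[log, #Score &];  (* pick the best checkpoint *)
    bestGenerator = Import[best["File"]];

Because every round is written to disk, the final choice does not depend on when training happens to stop, which is exactly why this outperforms early stopping for an unstable GAN.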
This stuck out to me as similar to my experience implementing a GAN in TensorFlow 2. The instabilities were maddening, and I'm still trying to fix some problems with mode collapse. Glad to hear your model selection strategy eventually worked.
Thank you so much!
Thanks, Pedro! I'm glad you liked the project :D
Thank you. Good to hear that you had a similar experience. Check out GAN Hacks by Soumith Chintala: https://github.com/soumith/ganhacks
Excellent work.
There are missing definitions such as ArrayLayer, and the notebook does not work. Any ideas?