Exploring OpenAI's Impact, Learning Resources, and Research Topics

OpenAI Demo and Productivity Tools:

  • OpenAI's demo of turning a wireframe into working code was impressive
  • Discussion on how it could be used for productivity tools like email, document writing, and slide creation
  • Some productivity and note-taking apps may become irrelevant as this technology matures

Deep Learning/NLP:

  • Discussion on various resources for learning about deep learning and NLP, including courses and papers
  • Recommendation to watch Karpathy’s nanoGPT video
  • Discussion on studying existing models and their building blocks
  • Xavi Amatriain’s catalog may be overwhelming for beginners
  • Non-technical questions are welcome

Transformers and Fine-Tuning:

  • Discussion on how transformers have taken over the field previously dominated by RNN variants
  • Fine-tuning any LLM is tricky
  • Prompt optimization may be more lucrative than fine-tuning
  • Prefix/prompt tuning may be the way to go for applications
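The prefix/prompt-tuning idea above can be sketched in PyTorch. This is a toy illustration, not any specific library's API: a tiny frozen model stands in for a pretrained LLM (in practice one would reach for something like Hugging Face PEFT), and the only trainable parameters are a handful of "soft prompt" embeddings prepended to every input.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pretrained LM: a frozen embedding plus a linear head.
# Hypothetical sizes chosen for the sketch only.
vocab_size, d_model, n_prompt = 100, 16, 8

class ToyLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, embeds):            # accepts embeddings, not token ids
        return self.head(embeds)

model = ToyLM()
for p in model.parameters():              # freeze the "pretrained" weights
    p.requires_grad = False

# The only trainable parameters: a soft prompt prepended to every input.
soft_prompt = nn.Parameter(torch.randn(n_prompt, d_model) * 0.02)

def forward_with_prompt(token_ids):
    tok_embeds = model.embed(token_ids)                       # (B, T, D)
    prompt = soft_prompt.unsqueeze(0).expand(token_ids.size(0), -1, -1)
    return model(torch.cat([prompt, tok_embeds], dim=1))      # (B, n_prompt+T, V)

ids = torch.randint(0, vocab_size, (2, 5))
logits = forward_with_prompt(ids)
print(logits.shape)                       # torch.Size([2, 13, 100])

trainable = [p for p in [soft_prompt] + list(model.parameters()) if p.requires_grad]
print(sum(p.numel() for p in trainable))  # 128 — only the 8x16 prompt trains
```

The appeal for applications is visible in the last line: the base model never changes, so one frozen copy can serve many tasks, each carrying only its own small prompt.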

Art and Stable Diffusion:

  • Discussion on using Stable Diffusion for art and prompt engineering
  • Playgroundai and civitai.com mentioned as resources
  • Prompt engineering changes with each new release
  • Next step after playing with basic prompts on Midjourney and DALL·E may be exploring image2image, in-painting, out-painting, and InstructPix2Pix
  • Discussion on integrating text2image providers with full-fledged tools like Canva and Photoshop
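The image2image workflow mentioned above starts from a user-supplied image rather than pure noise: the init image is noised up to a level set by a strength parameter, and denoising proceeds from there. A toy NumPy sketch of just that noising step (the schedule here is made up for illustration, not Stable Diffusion's actual one):

```python
import numpy as np

def noise_init_image(image, strength, rng):
    """Return the partially noised starting point for an img2img run.

    strength=0 keeps the image unchanged; strength=1 is (almost) pure noise.
    alpha_bar here is a toy schedule, not a real sampler's.
    """
    alpha_bar = (1.0 - strength) ** 2
    noise = rng.standard_normal(image.shape)
    return np.sqrt(alpha_bar) * image + np.sqrt(1.0 - alpha_bar) * noise

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64, 3))   # stand-in for an init image/latent

start = noise_init_image(img, strength=0.0, rng=rng)
print(np.allclose(start, img))           # True — strength 0 preserves the image
```

This is why low strength values stay close to the original composition while high values behave almost like text2image; real pipelines (e.g. in web UIs for Stable Diffusion) expose exactly this knob.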

Research and Emergent Properties:

  • Discussion on how bigger models have more emergent properties
  • Question on whether there are works that show when these properties arise during training

Miscellaneous:

  • Discussion on organizing an activity for Bangalore folks
  • RNNs are fighting back
  • Link to Google palm blog post on scaling language models

Note: descriptions and links below may be mismatched due to extraction errors.

  • https://twitter.com/prashanthshanm/status/1635646028648177669?s=20 - A tweet with the message “Too late, too little 😂😅😂😅”
  • https://web.stanford.edu/class/cs25/, https://web.stanford.edu/class/cs224n/index.html#schedule, https://people.cs.umass.edu/~miyyer/cs685/ - A few courses that explain things in detail (though most will be out of date by at least a few weeks or months). With my limited understanding, I think fine-tuning any LLM is quite tricky.
  • https://www.ai-art.dev/web-uis-for-stable-diffusion - A write-up on different web UIs for Stable Diffusion, shared by Sayak Paul on LinkedIn along with a message about working with artists to empower them more. The accompanying message also suggests exploring image2image, in-painting, out-painting, and InstructPix2Pix.
  • https://mobile.twitter.com/arankomatsuzaki/status/1635453248252391427: A tweet discussing the use of RNNs in empowering artists and questioning the emergence of properties during training.
  • https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html?m=1 - The blog post discusses the relationship between model size and emergent properties in language models, and asks whether these properties arise gradually or all at once during training. A GIF from the post is also referenced for additional context.