New open source foundation model released: Falcon 180B
* https://falconllm.tii.ae/falcon-models.html
* https://huggingface.co/blog/falcon-180b
* https://huggingface.co/tiiuae/falcon-180B
Claimed to be the best open source model available right now: better than Llama 2 and comparable to Google's PaLM 2-Large.
Big downside: you need around 400 GB of memory for inference. Even with aggressive quantization, it will be hard to run locally. Still, it may be interesting for researchers and companies (the license again allows commercial use). A rough sketch of loading it with quantization follows below.
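A minimal sketch of what loading the model with 4-bit quantization could look like, assuming the standard transformers + bitsandbytes path and the `tiiuae/falcon-180B` model ID from the links above; even at 4 bits the weights are roughly ~100 GB, so multiple large GPUs are still required:

```python
# Sketch: load Falcon 180B with 4-bit quantization (transformers + bitsandbytes).
# Assumes the model license has been accepted on Hugging Face and enough GPU
# memory is available across devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-180B"

# 4-bit NF4 quantization cuts weight memory to roughly a quarter of fp16.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # shard layers across all available GPUs
)

inputs = tokenizer("The Falcon models were trained on", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```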