Average Ratings
Llama 2: 0 Ratings
TinyLlama: 0 Ratings
Description (Llama 2)
This next iteration of our open-source large language model includes model weights and starting code for the pretrained and fine-tuned Llama language models, ranging from 7 billion to 70 billion parameters. The Llama 2 pretrained models were trained on 2 trillion tokens and offer double the context length of Llama 1, while the fine-tuned models have been trained on over 1 million human annotations. Llama 2 outperforms other open-source language models on many external benchmarks, including reasoning, coding, proficiency, and knowledge tests. Llama 2 was pretrained on publicly available online data; the fine-tuned variant, Llama-2-chat, additionally uses publicly available instruction datasets along with the human annotations mentioned above. Our open approach to AI has broad support from partners worldwide, including companies that have provided early feedback and are eager to build with Llama 2, reflecting a shift toward developing and using AI more collectively.
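As a rough illustration of how the released weights are typically consumed, the sketch below loads a Llama 2 chat checkpoint through the Hugging Face transformers library. The model ID, dtype, and device placement are assumptions made for this example; the official weights are gated and require accepting Meta's license before download.

```python
# Minimal sketch, assuming the commonly published Hugging Face checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed hub ID; 13B/70B variants follow the same pattern

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on a single modern GPU
    device_map="auto",          # let accelerate place layers on the available devices
)

prompt = "Explain the difference between pretraining and fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```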
Description (TinyLlama)
The TinyLlama project aims to pretrain a 1.1 billion parameter Llama model on 3 trillion tokens. With appropriate optimizations, this can be done in about 90 days on 16 A100-40G GPUs. TinyLlama uses the same architecture and tokenizer as Llama 2, so it can be plugged into many open-source projects built on Llama. Its compact 1.1 billion parameter footprint also suits applications with limited computational and memory budgets, letting developers integrate it into existing frameworks and workflows.
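Because TinyLlama keeps the Llama 2 architecture and tokenizer, one way to illustrate the claimed drop-in compatibility is that the same transformers loading code works with only the model ID swapped. The checkpoint name below is an assumption based on the project's published releases.

```python
# Minimal sketch, assuming the TinyLlama chat checkpoint published on the Hugging Face hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed hub ID for the 1.1B chat checkpoint

# Same Llama loading path as above; only the model ID changes.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 1.1B parameters fit comfortably on modest GPUs
)

prompt = "Summarize what makes a small language model useful."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```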
API Access
Llama 2: Has API
TinyLlama: Has API
Screenshots: No images available
Integrations (both Llama 2 and TinyLlama)
RunPod
AI4Chat
Agenta
AiAssistWorks
Aili
Alpaca
Ema
Entry Point AI
Evertune
Featherless
Pricing Details (both Llama 2 and TinyLlama)
Free
Free Trial
Free Version
Deployment (both Llama 2 and TinyLlama)
Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook
Customer Support (both Llama 2 and TinyLlama)
Business Hours
Live Rep (24/7)
Online Support
Types of Training (both Llama 2 and TinyLlama)
Training Docs
Webinars
Live Training (Online)
In Person
Vendor Details (Llama 2)
Company Name: Meta
Founded: 2004
Country: United States
Website: ai.meta.com/llama/
Vendor Details (TinyLlama)
Company Name: TinyLlama
Website: github.com/jzhang38/TinyLlama