
Description (OPT)

Large language models, often requiring extensive computational resources for training over long periods, have demonstrated impressive proficiency in zero- and few-shot learning tasks. Due to the high investment needed for their development, replicating these models poses a significant challenge for many researchers. Furthermore, access to the few models available via API is limited, as users cannot obtain the complete model weights, complicating academic exploration. In response to this, we introduce Open Pre-trained Transformers (OPT), a collection of decoder-only pre-trained transformers ranging from 125 million to 175 billion parameters, which we intend to share comprehensively and responsibly with interested scholars. Our findings indicate that OPT-175B exhibits performance on par with GPT-3, yet it is developed with only one-seventh of the carbon emissions required for GPT-3's training. Additionally, we will provide a detailed logbook that outlines the infrastructure hurdles we encountered throughout the project, as well as code to facilitate experimentation with all released models, ensuring that researchers have the tools they need to explore this technology further.
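
The released checkpoints below 175B parameters are hosted publicly on the Hugging Face Hub under the facebook organization, so a minimal zero-shot generation sketch looks as follows (assuming the transformers library and the facebook/opt-125m identifier; larger variants load the same way):

```python
# Minimal sketch: zero-shot text generation with the smallest OPT checkpoint.
# Assumes the Hugging Face transformers library; facebook/opt-125m is a
# publicly hosted checkpoint, and larger variants (up to opt-66b) load the
# same way. OPT-175B weights are distributed separately on request.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding of a short continuation.
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```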

Description (Pixtral Large)

Pixtral Large is a 124-billion-parameter multimodal model from Mistral AI, built on top of their earlier Mistral Large 2. It pairs a 123-billion-parameter multimodal decoder with a 1-billion-parameter vision encoder, enabling strong interpretation of documents, charts, and natural images while retaining first-class text comprehension. With a context window of 128,000 tokens, Pixtral Large can process at least 30 high-resolution images in a single request. It has achieved strong results on benchmarks such as MathVista, DocVQA, and VQAv2, outperforming competitors such as GPT-4o and Gemini-1.5 Pro. The model is available under the Mistral Research License for research and educational use, and under the Mistral Commercial License for business applications, making it suitable for both academic work and commercial products.
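
For programmatic access, Pixtral Large is served through Mistral's hosted API; the sketch below shows a single multimodal chat request, assuming the mistralai Python SDK (v1), the pixtral-large-latest model identifier, and a placeholder image URL:

```python
import os
from mistralai import Mistral

# Minimal sketch of a multimodal chat request against Mistral's hosted API.
# The MISTRAL_API_KEY environment variable, the image URL, and the prompt
# are placeholders for illustration.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="pixtral-large-latest",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize the chart in this image."},
                {"type": "image_url", "image_url": "https://example.com/chart.png"},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```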

API Access

OPT: Has API
Pixtral Large: Has API


Integrations (both products)

AI-FLOW
Airtrain
Arize Phoenix
BlueGPT
Continue
EvalsOne
Fleak
HoneyHive
Humiris AI
Klee
LM-Kit.NET
LibreChat
Literal AI
Mathstral
Memo AI
NexalAI
OpenLIT
SydeLabs
Verta
promptmate.io


Pricing Details

OPT: No price information available.
Pixtral Large: Free

Deployment (both products)

Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook


Customer Support (both products)

Business Hours
Live Rep (24/7)
Online Support


Types of Training (both products)

Training Docs
Webinars
Live Training (Online)
In Person


Vendor Details (OPT)

Company Name: Meta
Founded: 2004
Country: United States
Website: www.meta.com

Vendor Details (Pixtral Large)

Company Name: Mistral AI
Founded: 2023
Country: France
Website: mistral.ai/news/pixtral-large/


Alternatives (OPT)

T5 (Google)

Alternatives (Pixtral Large)

Mistral Small (Mistral AI)
Falcon-40B (Technology Innovation Institute)
CodeQwen (Alibaba)
Mistral 7B (Mistral AI)