
AWS Bedrock - Boto3 Demo - Mistral AI Models

Explore the Mistral and Mixtral models on AWS Bedrock using Boto3

Published Mar 10, 2024

Previous Blog in this Learning Series

All 10 blogs in this learning series

GitHub Link - Notebook

Environment Setup

I am using a local VS Code environment with AWS credentials configured.

Install Latest Python

Upgrade pip

Install the latest boto3, awscli, and botocore
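A minimal setup sketch for the three steps above (note the core package is botocore, not boto3-core):

    python3 -m pip install --upgrade pip
    python3 -m pip install --upgrade boto3 awscli botocore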

Load the Library
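A minimal sketch of the imports and client setup; the region name here is an assumption, so adjust it to a region where you have Bedrock model access:

    import json
    import boto3

    # Bedrock Runtime client for invoking foundation models
    # (region_name is an assumption; change to your region)
    bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")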

Mistral 7B Model

  • Mistral 7B is a 7-billion-parameter language model designed for exceptional performance and efficiency in Natural Language Processing (NLP).
  • It outperforms previous top models such as Llama 2 13B and Llama 1 34B across various benchmarks including reasoning, mathematics, and code generation.
  • Leveraging grouped-query attention (GQA) and sliding window attention (SWA), Mistral 7B achieves superior performance without sacrificing efficiency.
  • Mistral 7B models are released under the Apache 2.0 license, facilitating easy deployment and fine-tuning for diverse tasks.

Set the Prompt
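The notebook's exact prompt isn't shown here; below is a hypothetical prompt reconstructed from the completion further down, wrapped in Mistral's [INST] instruction tags:

    # Hypothetical prompt, inferred from the completion shown below
    prompt = """[INST]List the top 10 monuments or places to visit in India.
    Don't give details like history, how to reach, best time to visit, etc.,
    just the names of the monuments or places.[/INST]"""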

Configure the Model Parameters
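A sketch of the request body for the Mistral 7B Instruct model on Bedrock; the inference parameter values are illustrative, not the notebook's exact settings:

    model_id = "mistral.mistral-7b-instruct-v0:2"

    # Illustrative inference parameters; tune for your use case
    body = json.dumps({
        "prompt": prompt,
        "max_tokens": 512,
        "temperature": 0.5,
        "top_p": 0.9,
        "top_k": 50,
    })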

Invoke the Model
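Invoking the model with the Bedrock Runtime client, assuming the client, model_id, and body defined above:

    response = bedrock_runtime.invoke_model(
        modelId=model_id,
        body=body,
        accept="application/json",
        contentType="application/json",
    )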

Parse the response for Text Completion
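Mistral models on Bedrock return the completion under an outputs list; a minimal parse:

    response_body = json.loads(response["body"].read())

    # Each output carries the generated text and a stop_reason
    completion = response_body["outputs"][0]["text"]
    print(completion)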

Text Completion

like history, how to reach, best time to visit, etc. just the names of the monuments or places:

1. Taj Mahal, Agra, Uttar Pradesh
2. Red Fort, Delhi
3. Qutub Minar, Delhi
4. Hampi, Karnataka
5. Mahabodhi Temple, Bodhgaya, Bihar
6. Ajanta and Ellora Caves, Aurangabad, Maharashtra
7. Konark Sun Temple, Konark, Odisha
8. Khajuraho Temples, Madhya Pradesh
9. Meenakshi Amman Temple, Madurai, Tamil Nadu
10. Brihadeeswara Temple, Thanjavur, Tamil Nadu

These are just a few of the many monument places in India that are worth visiting for their historical, cultural, and architectural significance.

Mixtral 8x7B

  • Mixtral 8x7B is a Sparse Mixture of Experts (SMoE) language model.
  • Mixtral shares the same architecture as Mistral 7B, but each layer consists of 8 feedforward blocks (experts), with a router network selecting two experts to process each token at every layer.
  • Despite having access to 47B parameters, Mixtral only utilizes 13B active parameters during inference.
  • Mixtral outperforms or matches Llama 2 70B and GPT-3.5 across various benchmarks, especially excelling in mathematics, code generation, and multilingual tasks.

Set the Prompt
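Again hypothetical, reconstructed from the completion shown below rather than taken from the notebook:

    # Hypothetical prompt for the Mixtral run, inferred from the output below
    prompt = """[INST]List the top 10 scuba diving destinations in the world,
    just the place names.[/INST]"""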

Configure the Model Parameters
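Only the model ID changes; the request body keeps the same Mistral-family schema (values again illustrative):

    model_id = "mistral.mixtral-8x7b-instruct-v0:1"

    body = json.dumps({
        "prompt": prompt,
        "max_tokens": 512,
        "temperature": 0.5,
        "top_p": 0.9,
        "top_k": 50,
    })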

Invoke the Model
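The invocation itself is identical to the Mistral 7B call above:

    response = bedrock_runtime.invoke_model(
        modelId=model_id,
        body=body,
        accept="application/json",
        contentType="application/json",
    )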

Parse the response for Text Completion
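And the same response shape applies:

    response_body = json.loads(response["body"].read())
    print(response_body["outputs"][0]["text"])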

Text Completion

1. Great Barrier Reef, Australia
2. Palau, Micronesia
3. Blue Hole, Belize
4. Raja Ampat, Indonesia
5. Maldives, Indian Ocean
6. Cozumel, Mexico
7. Galapagos Islands, Ecuador
8. Sipadan Island, Malaysia
9. Fiji, South Pacific
10. Bonaire, Caribbean Netherlands

These are some of the top scuba diving destinations in the world. Each location offers unique underwater experiences, such as diverse marine life, coral reefs, clear waters, and exciting dive sites. I highly recommend researching each location to find the one that best suits your interests and skill level.
 
