
Multi-LLM routing strategies for generative AI applications on AWS

AWS Machine Learning - AI

Organizations are increasingly using multiple large language models (LLMs) when building generative AI applications. For detailed implementation guidelines and examples of Intelligent Prompt Routing on Amazon Bedrock, see Reduce costs and latency with Amazon Bedrock Intelligent Prompt Routing and prompt caching.
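The core idea behind multi-LLM routing is sending each prompt to the cheapest model that can handle it. A minimal sketch of that pattern, assuming an illustrative complexity heuristic and placeholder model IDs (this is not the actual Amazon Bedrock Intelligent Prompt Routing implementation):

```python
# Hypothetical multi-LLM router sketch: route short/simple prompts to a
# smaller, cheaper model and complex prompts to a larger one.
# The heuristic and model names below are illustrative assumptions.

def estimate_complexity(prompt: str) -> float:
    """Crude heuristic: longer prompts and reasoning keywords score higher."""
    score = min(len(prompt.split()) / 100.0, 1.0)
    if any(w in prompt.lower() for w in ("why", "explain", "compare")):
        score += 0.5
    return min(score, 1.0)

def route(prompt: str, threshold: float = 0.5) -> str:
    """Return the (placeholder) model ID to invoke for this prompt."""
    small, large = "small-model", "large-model"
    return large if estimate_complexity(prompt) >= threshold else small
```

In a production router, the heuristic would typically be replaced by a learned classifier or the managed routing service itself, but the dispatch shape stays the same.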


Principal Financial Group uses QnABot on AWS and Amazon Q Business to enhance workforce productivity with generative AI

AWS Machine Learning - AI

Using QnABot on AWS (QnABot) integrated with Microsoft Entra ID access controls, Principal launched an intelligent self-service solution rooted in generative AI. As a leader in financial services, Principal wanted to make sure all data and responses adhered to strict risk management and responsible AI guidelines.



EBSCOlearning scales assessment generation for their online learning content with generative AI

AWS Machine Learning - AI

In this post, we illustrate how EBSCOlearning partnered with the AWS Generative AI Innovation Center (GenAIIC) to harness generative AI in revolutionizing their learning assessment process. This multifaceted approach ensures that the questions adhere to all quality standards and guidelines.


Responsible AI in action: How Data Reply red teaming supports generative AI safety on AWS

AWS Machine Learning - AI

Generative AI is rapidly reshaping industries worldwide, empowering businesses to deliver exceptional customer experiences, streamline processes, and push innovation at an unprecedented scale. Specifically, we discuss Data Reply's red teaming solution, a comprehensive blueprint to enhance AI safety and responsible AI practices.


Introducing AWS MCP Servers for code assistants (Part 1)

AWS Machine Learning - AI

We're excited to announce the open source release of AWS MCP Servers for code assistants, a suite of specialized Model Context Protocol (MCP) servers that bring Amazon Web Services (AWS) best practices directly to your development workflow. This post is the first in a series covering AWS MCP Servers.


Accelerating insurance policy reviews with generative AI: Verisk’s Mozart companion

AWS Machine Learning - AI

At the forefront of using generative AI in the insurance industry, Verisk's generative AI-powered solutions, like Mozart, remain rooted in ethical and responsible AI use. The new Mozart companion is built using Amazon Bedrock. In the future, Verisk intends to use the Amazon Titan Embeddings V2 model.


Model customization, RAG, or both: A case study with Amazon Nova

AWS Machine Learning - AI

As businesses and developers increasingly seek to optimize their language models for specific tasks, the decision between model customization and Retrieval Augmented Generation (RAG) becomes critical. Our study used Amazon Nova Micro and Amazon Nova Lite as baseline FMs and tested their performance across different configurations.