
Use the ApplyGuardrail API with long-context inputs and streaming outputs in Amazon Bedrock

AWS Machine Learning

The new ApplyGuardrail API enables you to assess any text using your preconfigured guardrails in Amazon Bedrock, without invoking the FMs. In this post, we demonstrate how to use the ApplyGuardrail API with long-context inputs and streaming outputs. For example, you can now use the API with models hosted on Amazon SageMaker.
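To make the long-context pattern concrete, here is a minimal sketch of chunking a long input and passing each chunk through the `ApplyGuardrail` API via boto3. The per-request text-unit limit (25 text units of roughly 1,000 characters each) and the guardrail ID/version values are assumptions for illustration; check your account's Bedrock service quotas and substitute your own guardrail identifier.

```python
# Sketch: assess a long input with a preconfigured Bedrock guardrail by
# splitting it into chunks that fit under the per-call text-unit limit.
# The 25-unit / ~1,000-characters-per-unit figures are assumptions, not
# documented guarantees -- verify against your Bedrock quotas.

CHARS_PER_TEXT_UNIT = 1000    # assumed character-to-text-unit ratio
MAX_TEXT_UNITS_PER_CALL = 25  # assumed per-request limit

def chunk_text(text, max_chars=CHARS_PER_TEXT_UNIT * MAX_TEXT_UNITS_PER_CALL):
    """Split text into consecutive pieces no longer than max_chars."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def apply_guardrail_to_long_text(text, guardrail_id, guardrail_version,
                                 region="us-east-1"):
    """Run each chunk through the guardrail; stop early on intervention."""
    import boto3  # imported here so chunk_text stays usable without the SDK
    client = boto3.client("bedrock-runtime", region_name=region)
    response = None
    for chunk in chunk_text(text):
        response = client.apply_guardrail(
            guardrailIdentifier=guardrail_id,    # hypothetical, e.g. "gr-abc123"
            guardrailVersion=guardrail_version,  # e.g. "1"
            source="INPUT",  # use "OUTPUT" to assess model responses
            content=[{"text": {"text": chunk}}],
        )
        if response["action"] == "GUARDRAIL_INTERVENED":
            return response  # a chunk violated a policy; no need to continue
    return response
```

The same chunking approach applies to streaming outputs: buffer the stream and call the guardrail on each accumulated chunk, trading a little latency for policy coverage.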


Best practices for building robust generative AI applications with Amazon Bedrock Agents – Part 2

AWS Machine Learning

In Part 1 of this series, we explored best practices for creating accurate and reliable agents using Amazon Bedrock Agents. The agent can use company APIs and external knowledge through Retrieval Augmented Generation (RAG). If you already have an OpenAPI schema for your application, the best practice is to start with it.


Trending Sources


Best practices for building robust generative AI applications with Amazon Bedrock Agents – Part 1

AWS Machine Learning

This two-part series explores best practices for building generative AI applications using Amazon Bedrock Agents. This data provides a benchmark for expected agent behavior, including the interaction with existing APIs, knowledge bases, and guardrails connected with the agent.


Best practices to build generative AI applications on AWS

AWS Machine Learning

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon via a single API. This is because such tasks require organization-specific data and workflows that typically need custom programming.


Training large language models on Amazon SageMaker: Best practices

AWS Machine Learning

In this post, we dive into tips and best practices for successful LLM training on Amazon SageMaker Training. The post covers all the phases of an LLM training workload and describes associated infrastructure features and best practices. Some of the best practices in this post refer specifically to ml.p4d.24xlarge instances.


How to Communicate Effectively with Customers: Best Practices

Cincom

The key stages include initial research, where early messaging should focus on educating prospects with thought leadership content detailing industry trends and best practices. Setting up workflows triggered by events like purchases, milestones, or renewals enables sending personalized messages precisely when engagement potential is highest.


Achieve operational excellence with well-architected generative AI solutions using Amazon Bedrock

AWS Machine Learning

Because this is an emerging area, best practices, practical guidance, and design patterns are difficult to find in an easily consumable form. This integration ensures enterprises can take advantage of the full power of generative AI while adhering to best practices in operational excellence.