
Amazon Bedrock Guardrails announces IAM Policy-based enforcement to deliver safe AI interactions

AWS Machine Learning

Beyond Amazon Bedrock models, the service offers the flexible ApplyGuardrail API, which lets you assess text using your pre-configured guardrails without invoking FMs, so you can implement safety controls across generative AI applications, whether running on Amazon Bedrock or on other systems, at both input and output levels.
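As a rough illustration (not the post's code), the ApplyGuardrail operation can be called from the bedrock-runtime client in boto3; the guardrail identifier, version, and sample text below are placeholders:

```python
import boto3

# Assumes a guardrail you have already configured; identifier and version are placeholders.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock_runtime.apply_guardrail(
    guardrailIdentifier="my-guardrail-id",  # placeholder guardrail ID
    guardrailVersion="1",
    source="INPUT",  # assess user input; use "OUTPUT" for model responses
    content=[{"text": {"text": "Tell me how to bypass a content filter."}}],
)

# "GUARDRAIL_INTERVENED" means the configured policies blocked or masked content.
print(response["action"])
```

Because the call takes raw text rather than a model invocation, the same check can sit in front of any model or application, which is the point the excerpt makes.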


Integrate generative AI capabilities into Microsoft Office using Amazon Bedrock

AWS Machine Learning

Note that these APIs use objects as namespaces, alleviating the need for explicit imports. API Gateway supports multiple mechanisms for controlling and managing access to an API. AWS Lambda handles the REST API integration, processing the requests and invoking the appropriate AWS services.
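A minimal sketch of that Lambda-behind-API-Gateway pattern, not the article's actual handler; the model ID and request shape are assumptions:

```python
import json
import boto3

# Hypothetical handler illustrating the API Gateway -> Lambda -> Bedrock flow.
bedrock_runtime = boto3.client("bedrock-runtime")

def lambda_handler(event, context):
    # API Gateway proxy integration passes the HTTP body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    prompt = body.get("prompt", "")

    response = bedrock_runtime.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    completion = response["output"]["message"]["content"][0]["text"]

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"completion": completion}),
    }
```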



Security best practices to consider while fine-tuning models in Amazon Bedrock

AWS Machine Learning

By employing a VPC interface endpoint, you can make sure communication between your VPC and the Amazon Bedrock API endpoint occurs through a PrivateLink connection rather than over the public internet. The post includes a sample resource policy; provide your own account, bucket name, and VPC settings, then choose Create roles.
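As a hedged sketch of what such a resource policy can look like (not the post's exact policy), the following denies S3 access to a fine-tuning data bucket unless requests arrive through a specific interface endpoint; the bucket name and endpoint ID are placeholders:

```python
import json
import boto3

# Placeholder values; substitute your own bucket name and VPC endpoint ID.
BUCKET = "my-fine-tuning-data-bucket"
VPC_ENDPOINT_ID = "vpce-0abcd1234efgh5678"

# Example bucket policy keeping training data off the public internet.
# Caution: a broad Deny like this also blocks access from outside the VPC
# (including the console); scope it to fit your environment.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAccessOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"StringNotEquals": {"aws:sourceVpce": VPC_ENDPOINT_ID}},
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```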


Enable Amazon Bedrock cross-Region inference in multi-account environments

AWS Machine Learning

Importantly, cross-Region inference prioritizes the connected Amazon Bedrock API source Region when possible, helping minimize latency and improve overall responsiveness. You can then test the model using the Amazon Bedrock console or the API by assuming the custom IAM role mentioned in the previous step (Bedrock-Access-CRI).
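A rough sketch of that flow, assuming the Bedrock-Access-CRI role exists and using a placeholder account ID and a US cross-Region inference profile ID:

```python
import boto3

# Assume the custom role referenced in the post (account ID is a placeholder).
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/Bedrock-Access-CRI",
    RoleSessionName="cri-test",
)["Credentials"]

bedrock_runtime = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",  # the source Region you are connected to
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

# Invoke through a cross-Region inference profile ID (placeholder); Bedrock routes
# the request, preferring the source Region when capacity allows.
response = bedrock_runtime.converse(
    modelId="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```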


Use AWS PrivateLink to set up private access to Amazon Bedrock

AWS Machine Learning

Amazon Bedrock allows developers to build and scale generative AI applications using FMs through an API, without managing infrastructure. Customers are building innovative generative AI applications with Amazon Bedrock APIs and their own proprietary data.
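For illustration only, a Bedrock client can be pointed at an interface endpoint's DNS name so calls stay on the private network; the endpoint URL and model ID below are placeholders, not the post's configuration:

```python
import boto3

# Placeholder VPC endpoint DNS name; with private DNS enabled you can usually keep
# the default regional endpoint and traffic still flows over PrivateLink.
PRIVATE_ENDPOINT = "https://vpce-0abcd1234efgh5678-abcdefgh.bedrock-runtime.us-east-1.vpce.amazonaws.com"

bedrock_runtime = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",
    endpoint_url=PRIVATE_ENDPOINT,
)

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": "Summarize our Q3 report."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```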


Secure Amazon SageMaker Studio presigned URLs Part 2: Private API with JWT authentication

AWS Machine Learning

In this post, we continue to build on the previous solution to demonstrate how to build a private API in Amazon API Gateway as a proxy interface for generating and accessing Amazon SageMaker presigned URLs. The user invokes the createStudioPresignedUrl API on API Gateway along with a token in the header.
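A minimal sketch of the caller's side of that flow, not the post's code; the invoke URL, header name, request body, and token are all placeholders:

```python
import requests

# Placeholder private API Gateway invoke URL and JWT; in the post the API is only
# reachable from inside the VPC through an interface endpoint.
API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/createStudioPresignedUrl"
JWT_TOKEN = "eyJ..."  # token issued by your identity provider (truncated placeholder)

resp = requests.post(
    API_URL,
    headers={"Authorization": JWT_TOKEN},
    json={"studioUserProfileName": "data-scientist-1"},  # assumed request shape
    timeout=10,
)
resp.raise_for_status()

# The backing Lambda typically calls sagemaker:CreatePresignedDomainUrl and
# returns the short-lived Studio URL to the caller.
print(resp.json())
```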


Fine-tune LLMs with synthetic data for context-based Q&A using Amazon Bedrock

AWS Machine Learning

Amazon Bedrock is a fully managed service that makes FMs from leading AI startups and Amazon available through an API, so you can choose from a wide range of FMs to find the model that is best suited for your use case. Solution overview: the solution comprises two main steps, starting with generating synthetic data using the Amazon Bedrock InvokeModel API.
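A hedged sketch of that first step, calling InvokeModel to produce a synthetic question-answer pair; the model ID, prompt, and Anthropic request-body shape are assumptions rather than the post's exact code:

```python
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Ask the model to produce a synthetic question-answer pair for a given context.
prompt = (
    "Given the following context, write one question and its answer as JSON "
    'with keys "question" and "answer".\n\nContext: Amazon Bedrock is a fully '
    "managed service for building generative AI applications."
)

# Anthropic Messages request body; the model ID below is an assumed placeholder.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [{"role": "user", "content": [{"type": "text", "text": prompt}]}],
}

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps(body),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])  # the synthetic Q&A pair
```

Repeating this over many context passages yields the synthetic dataset that the second step uses for fine-tuning.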
