Governing ML lifecycle at scale: Best practices to set up cost and usage visibility of ML workloads in multi-account environments

AWS Machine Learning

This post outlines steps you can take to implement a comprehensive tagging governance strategy across accounts, using AWS tools and services that provide visibility and control. Tagging is an effective scaling mechanism for implementing cloud management and governance strategies.
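As a rough sketch of what tag-based governance can look like in practice, the following hypothetical boto3 snippet attaches cost-allocation tags to a SageMaker training job; the ARN, tag keys, and values are illustrative placeholders rather than recommendations from the post.

import boto3

# Attach cost-allocation tags to an existing SageMaker resource (placeholder ARN).
sagemaker = boto3.client("sagemaker")
sagemaker.add_tags(
    ResourceArn="arn:aws:sagemaker:us-east-1:111122223333:training-job/example-job",
    Tags=[
        {"Key": "CostCenter", "Value": "ml-research"},    # illustrative tag keys/values
        {"Key": "Project", "Value": "churn-prediction"},
        {"Key": "Environment", "Value": "dev"},
    ],
)

Once keys like these are activated as cost allocation tags in the billing console, they can be used to break down ML spend by team, project, or environment in AWS Cost Explorer.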

Centralize model governance with SageMaker Model Registry Resource Access Manager sharing

AWS Machine Learning

Customers can use the SageMaker Studio UI or APIs to specify the SageMaker Model Registry model to be shared and grant access to specific AWS accounts or to everyone in the organization. This streamlines ML workflows, enables better visibility and governance, and accelerates the adoption of ML models across the organization.
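As a minimal sketch of the cross-account sharing this enables, the following hypothetical boto3 call creates an AWS RAM resource share for a model package group; the ARN, account ID, and share name are placeholders.

import boto3

# Share a central model package group with a consumer account via AWS RAM
# (placeholder ARN and account ID).
ram = boto3.client("ram")
ram.create_resource_share(
    name="central-model-registry-share",
    resourceArns=[
        "arn:aws:sagemaker:us-east-1:111122223333:model-package-group/fraud-detection"
    ],
    principals=["222233334444"],  # a consumer account, OU, or organization ARN
    allowExternalPrincipals=False,
)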

Trending Sources

Use the ApplyGuardrail API with long-context inputs and streaming outputs in Amazon Bedrock

AWS Machine Learning

The new ApplyGuardrail API enables you to assess any text using your preconfigured guardrails in Amazon Bedrock, without invoking the FMs. In this post, we demonstrate how to use the ApplyGuardrail API with long-context inputs and streaming outputs. For example, you can now use the API with models hosted on Amazon SageMaker.
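As a minimal sketch, assuming a placeholder guardrail ID and version, assessing model output with the ApplyGuardrail API through boto3 might look like this:

import boto3

bedrock_runtime = boto3.client("bedrock-runtime")

# Assess generated text against a preconfigured guardrail without invoking an FM.
response = bedrock_runtime.apply_guardrail(
    guardrailIdentifier="gr-example123",  # placeholder guardrail ID
    guardrailVersion="1",
    source="OUTPUT",                      # use "INPUT" to assess user prompts instead
    content=[{"text": {"text": "Model-generated text to assess goes here."}}],
)

if response["action"] == "GUARDRAIL_INTERVENED":
    print("Guardrail intervened:", response["outputs"])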

Use Amazon SageMaker Model Card sharing to improve model governance

AWS Machine Learning

ML governance is the framework that gives systematic visibility into ML model development, validation, and usage. At AWS re:Invent 2022, AWS introduced new ML governance tools for Amazon SageMaker that simplify access control and enhance transparency over your ML projects.
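As an illustrative sketch of documenting a model for governance review, the following hypothetical boto3 call creates a draft model card; the card name and the minimal subset of the model card content schema shown here are assumptions for illustration.

import boto3, json

sagemaker = boto3.client("sagemaker")

# Create a draft model card; the content is a minimal illustrative subset of the schema.
sagemaker.create_model_card(
    ModelCardName="churn-model-card",
    ModelCardStatus="Draft",
    Content=json.dumps({
        "model_overview": {
            "model_description": "XGBoost churn classifier",
            "model_owner": "ml-platform-team",
        },
        "intended_uses": {"intended_uses": "Batch churn scoring for retention campaigns"},
    }),
)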

Best practices to build generative AI applications on AWS

AWS Machine Learning

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API. Following these best practices improves efficiency, allows larger contexts to be used, and supports safer adoption.
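As a minimal sketch of that single-API access pattern, assuming the model ID and prompt below as placeholders, a request through the Bedrock Converse API might look like this:

import boto3

bedrock_runtime = boto3.client("bedrock-runtime")

# Send a single-turn request to a Bedrock-hosted foundation model.
response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model choice
    messages=[{"role": "user", "content": [{"text": "Summarize our service-tier options."}]}],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])

Because the request shape stays the same across providers, switching models is largely a matter of changing modelId.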

Achieve operational excellence with well-architected generative AI solutions using Amazon Bedrock

AWS Machine Learning

However, scaling up generative AI and making adoption easier for different lines of business (LOBs) comes with challenges around making sure that data privacy and security, legal and compliance requirements, and operational complexities are governed at an organizational level. In this post, we discuss how to address these challenges holistically.

Best practices for building robust generative AI applications with Amazon Bedrock Agents – Part 1

AWS Machine Learning

This two-part series explores best practices for building generative AI applications using Amazon Bedrock Agents. Ground truth data provides a benchmark for expected agent behavior, including its interaction with existing APIs, knowledge bases, and guardrails connected to the agent.
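As a hypothetical sketch of exercising an agent against such benchmark expectations, assuming placeholder agent, alias, and session IDs, invoking the agent and collecting its streamed response might look like this:

import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

# Invoke the agent and assemble the streamed completion for comparison
# against the expected (benchmark) behavior.
response = agent_runtime.invoke_agent(
    agentId="AGENT123456",       # placeholder IDs
    agentAliasId="ALIAS123456",
    sessionId="test-session-001",
    inputText="What is the status of order 42?",
)

completion = ""
for event in response["completion"]:
    if "chunk" in event:
        completion += event["chunk"]["bytes"].decode("utf-8")
print(completion)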