
Governing ML lifecycle at scale: Best practices to set up cost and usage visibility of ML workloads in multi-account environments

AWS Machine Learning

In a multi-account environment, you can track costs at the AWS account level to associate expenses with individual ML workloads, but a combination of AWS accounts and tags provides the best results. Tagging is an effective scaling mechanism for implementing cloud management and governance strategies.
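
As a rough sketch of the tagging mechanism described here (not code from the post), the following boto3 snippet attaches cost-allocation tags to a SageMaker resource. The training job ARN and the tag keys (team, cost-center, project) are placeholders, and the keys must still be activated as cost allocation tags in the Billing console before they appear in Cost Explorer.

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Hypothetical training job ARN; replace with a real resource in your account.
training_job_arn = (
    "arn:aws:sagemaker:us-east-1:111122223333:training-job/churn-model-training"
)

# Illustrative cost-allocation tags used to attribute spend across teams,
# cost centers, and projects in a multi-account setup.
sagemaker.add_tags(
    ResourceArn=training_job_arn,
    Tags=[
        {"Key": "team", "Value": "fraud-detection"},
        {"Key": "cost-center", "Value": "cc-1234"},
        {"Key": "project", "Value": "churn-prediction"},
    ],
)
```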


Centralize model governance with SageMaker Model Registry Resource Access Manager sharing

AWS Machine Learning

We recently announced the general availability of cross-account sharing of Amazon SageMaker Model Registry using AWS Resource Access Manager (AWS RAM), making it easier to securely share and discover machine learning (ML) models across your AWS accounts. Human oversight: including human involvement in AI decision-making processes.
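
As a minimal sketch of what such a cross-account share might look like (the ARN, account ID, and share name below are assumptions, not values from the post), you could create an AWS RAM resource share for a model package group with boto3:

```python
import boto3

ram = boto3.client("ram")

# Hypothetical values: a model package group in the producer account and the
# account ID of the consumer account that should discover the models.
model_package_group_arn = (
    "arn:aws:sagemaker:us-east-1:111122223333:model-package-group/fraud-models"
)
consumer_account_id = "444455556666"

# Create a resource share exposing the model package group to the consumer
# account; model discovery then happens through its SageMaker Model Registry.
response = ram.create_resource_share(
    name="fraud-models-registry-share",
    resourceArns=[model_package_group_arn],
    principals=[consumer_account_id],
    allowExternalPrincipals=False,  # keep sharing within the organization
)
print(response["resourceShare"]["resourceShareArn"])
```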




Govern generative AI in the enterprise with Amazon SageMaker Canvas

AWS Machine Learning

This is crucial for compliance, security, and governance. In this post, we analyze strategies for governing access to Amazon Bedrock and SageMaker JumpStart models from within SageMaker Canvas using AWS Identity and Access Management (IAM) policies. We provide code examples tailored to common enterprise governance scenarios.
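
The post supplies its own examples; as an illustrative stand-in, the sketch below creates an IAM policy that limits a Canvas user's SageMaker execution role to a single Amazon Bedrock model. The model ID, policy name, and role name are hypothetical.

```python
import json
import boto3

iam = boto3.client("iam")

# Illustrative governance policy: allow the Canvas execution role to invoke
# only one Bedrock foundation model (all names and IDs are placeholders).
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowSingleBedrockModel",
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel",
                "bedrock:InvokeModelWithResponseStream",
            ],
            "Resource": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
        }
    ],
}

policy = iam.create_policy(
    PolicyName="canvas-bedrock-single-model",
    PolicyDocument=json.dumps(policy_document),
)

# Attach the policy to the Canvas user's SageMaker execution role
# (hypothetical role name).
iam.attach_role_policy(
    RoleName="SageMakerCanvasExecutionRole",
    PolicyArn=policy["Policy"]["Arn"],
)
```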


Improve governance of models with Amazon SageMaker unified Model Cards and Model Registry

AWS Machine Learning

You can now register machine learning (ML) models in Amazon SageMaker Model Registry with Amazon SageMaker Model Cards, making it straightforward to manage governance information for specific model versions directly in SageMaker Model Registry in just a few clicks.
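
A minimal boto3 sketch of this workflow, assuming the ModelCard field on CreateModelPackage introduced with this launch; the model package group, container image, artifact path, and model card fields are placeholders and only a small subset of the model card schema is shown.

```python
import json
import boto3

sagemaker = boto3.client("sagemaker")

# Minimal model card content with placeholder values.
model_card_content = json.dumps(
    {
        "model_overview": {
            "model_description": "XGBoost churn classifier",
            "model_owner": "ml-platform-team",
        },
        "intended_uses": {"intended_uses": "Batch scoring of churn risk"},
    }
)

# Register a model version with its governance information attached through
# the ModelCard field of the model package.
sagemaker.create_model_package(
    ModelPackageGroupName="churn-models",
    ModelApprovalStatus="PendingManualApproval",
    InferenceSpecification={
        "Containers": [
            {
                "Image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/xgboost:latest",
                "ModelDataUrl": "s3://my-bucket/churn/model.tar.gz",
            }
        ],
        "SupportedContentTypes": ["text/csv"],
        "SupportedResponseMIMETypes": ["text/csv"],
    },
    ModelCard={
        "ModelCardContent": model_card_content,
        "ModelCardStatus": "Draft",
    },
)
```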


Governing the ML lifecycle at scale: Centralized observability with Amazon SageMaker and Amazon CloudWatch

AWS Machine Learning

This post is part of an ongoing series on governing the machine learning (ML) lifecycle at scale. To start from the beginning, refer to Governing the ML lifecycle at scale, Part 1: A framework for architecting ML workloads using Amazon SageMaker.
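
One way to centralize ML telemetry across accounts is CloudWatch cross-account observability. The sketch below is not taken from the series: it assumes a central monitoring account that creates a sink and one workload account that links its metrics and log groups to that sink, with the two profile names standing in for credentials in each account.

```python
import boto3

# Two sessions: the profile names are placeholders for credentials in the
# central monitoring account and in one ML workload account.
monitoring = boto3.Session(profile_name="monitoring-account").client("oam")
workload = boto3.Session(profile_name="ml-workload-account").client("oam")

# In the monitoring account: create a sink that workload accounts can send
# telemetry to. A sink policy (put_sink_policy) is also required to authorize
# the source accounts; it is omitted here.
sink = monitoring.create_sink(Name="ml-observability-sink")

# In the workload account: link CloudWatch metrics and log groups (for
# example, SageMaker training and endpoint metrics) to the central sink.
workload.create_link(
    LabelTemplate="$AccountName",
    ResourceTypes=["AWS::CloudWatch::Metric", "AWS::Logs::LogGroup"],
    SinkIdentifier=sink["Arn"],
)
```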


Secure Amazon SageMaker Studio presigned URLs Part 3: Multi-account private API access to Studio

AWS Machine Learning

These customers need to balance governance, security, and compliance against the need for machine learning (ML) teams to quickly access their data science environments in a secure manner. One important aspect of this foundation is to organize their AWS environment following a multi-account strategy.
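
As a small illustration of the presigned URL mechanism (the domain ID and user profile name are placeholders, and in the multi-account pattern this call would typically be fronted by a private API rather than issued directly), you can generate a short-lived Studio URL with boto3:

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Generate a short-lived presigned URL for a Studio user profile.
response = sagemaker.create_presigned_domain_url(
    DomainId="d-xxxxxxxxxxxx",            # placeholder Studio domain ID
    UserProfileName="data-scientist-1",   # placeholder user profile
    ExpiresInSeconds=300,                 # keep the URL validity window short
)
print(response["AuthorizedUrl"])
```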


Use the ApplyGuardrail API with long-context inputs and streaming outputs in Amazon Bedrock

AWS Machine Learning

The new ApplyGuardrail API enables you to assess any text using your preconfigured guardrails in Amazon Bedrock, without invoking the foundation models (FMs). In this post, we demonstrate how to use the ApplyGuardrail API with long-context inputs and streaming outputs. For example, you can now use the API with models hosted on Amazon SageMaker.
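
As a minimal sketch of a single ApplyGuardrail call (the guardrail ID and version are placeholders; for long-context inputs you would split the text into chunks within the guardrail's text-unit limits and assess each chunk the same way):

```python
import boto3

bedrock_runtime = boto3.client("bedrock-runtime")

# The guardrail must already be configured in Amazon Bedrock; the identifier
# and version below are placeholders.
response = bedrock_runtime.apply_guardrail(
    guardrailIdentifier="your-guardrail-id",
    guardrailVersion="1",
    source="OUTPUT",  # assess model output; use "INPUT" for user prompts
    content=[{"text": {"text": "Model-generated text to evaluate goes here."}}],
)

if response["action"] == "GUARDRAIL_INTERVENED":
    # Use the guardrail's masked or blocked replacement text.
    print(response["outputs"][0]["text"])
else:
    print("No intervention; pass the text through unchanged.")
```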
