
Centralize model governance with SageMaker Model Registry Resource Access Manager sharing

AWS Machine Learning

Customers can use the SageMaker Studio UI or APIs to specify the SageMaker Model Registry model to be shared and grant access to specific AWS accounts or to everyone in the organization. This streamlines ML workflows, improves visibility and governance, and accelerates the adoption of ML models across the organization.
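The sharing described above rides on AWS Resource Access Manager. A minimal sketch of the API side, assuming a hypothetical model package group ARN and placeholder account IDs (none of these identifiers come from the post):

```python
# Sketch of sharing a SageMaker model package group via AWS RAM.
# The ARN, share name, and account IDs below are illustrative placeholders.
GROUP_ARN = "arn:aws:sagemaker:us-east-1:123456789012:model-package-group/fraud-models"

def build_resource_share(group_arn: str, principals: list[str]) -> dict:
    """Build the arguments for ram.create_resource_share."""
    return {
        "name": "model-registry-share",
        "resourceArns": [group_arn],
        "principals": principals,  # account IDs, or an organization/OU ARN
        "allowExternalPrincipals": False,  # keep sharing inside the organization
    }

share_args = build_resource_share(GROUP_ARN, ["210987654321"])
# With credentials configured, the share would be created with:
# boto3.client("ram").create_resource_share(**share_args)
```

Passing an organization ARN as the principal is what grants access "to everyone in the organization," while listing account IDs grants access to specific accounts.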


Amazon Bedrock launches Session Management APIs for generative AI applications (Preview)

AWS Machine Learning

Amazon Bedrock announces the preview launch of Session Management APIs, a new capability that enables developers to simplify state and context management for generative AI applications built with popular open source frameworks such as LangGraph and LlamaIndex. Building generative AI applications requires more than model API calls.


Trending Sources


Build a multi-tenant generative AI environment for your enterprise on AWS

AWS Machine Learning

We also dive deeper into access patterns, governance, responsible AI, observability, and common solution designs like Retrieval Augmented Generation (RAG). The environment also uses a number of other AWS services, such as Amazon API Gateway, AWS Lambda, and Amazon SageMaker. API Gateway is serverless and therefore scales automatically with traffic.
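One common shape for the API Gateway + Lambda + SageMaker pattern is a Lambda handler that routes each tenant to its own inference endpoint. A minimal sketch, assuming a hypothetical `x-tenant-id` header and made-up tenant/endpoint names (the post does not specify these details):

```python
import json

# Hypothetical tenant-to-endpoint mapping; names are illustrative only.
TENANT_ENDPOINTS = {
    "acme": "sm-endpoint-acme",
    "globex": "sm-endpoint-globex",
}

def resolve_endpoint(event: dict) -> str:
    """Pick the per-tenant SageMaker endpoint from an API Gateway proxy event."""
    tenant = event.get("headers", {}).get("x-tenant-id", "")
    if tenant not in TENANT_ENDPOINTS:
        raise ValueError(f"unknown tenant: {tenant!r}")
    return TENANT_ENDPOINTS[tenant]

def handler(event, context):
    endpoint = resolve_endpoint(event)
    payload = json.dumps({"inputs": json.loads(event["body"])["prompt"]})
    # In a real Lambda, the endpoint would be invoked with boto3:
    # boto3.client("sagemaker-runtime").invoke_endpoint(
    #     EndpointName=endpoint, ContentType="application/json", Body=payload)
    return {"statusCode": 200, "body": json.dumps({"endpoint": endpoint})}
```

Because API Gateway and Lambda both scale automatically, only the per-tenant SageMaker endpoints need explicit capacity planning in this design.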


Considerations for addressing the core dimensions of responsible AI for Amazon Bedrock applications

AWS Machine Learning

For now, we consider eight key dimensions of responsible AI: fairness, explainability, privacy and security, safety, controllability, veracity and robustness, governance, and transparency. Regular evaluations allow you to adjust and steer the AI’s behavior based on feedback and performance metrics.


Track LLM model evaluation using Amazon SageMaker managed MLflow and FMEval

AWS Machine Learning

Evaluation algorithm: computes evaluation metrics from model outputs. Different algorithms require different metrics to be specified. The MLflow tracking server functions as a standalone HTTP server that provides REST API endpoints for monitoring, recording, and visualizing experiment runs, which allows you to keep track of your ML experiments.
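Because the tracking server is a plain HTTP server, evaluation metrics computed by FMEval can be recorded against a run through MLflow's REST API. A minimal sketch that builds (but does not send) such a request; the tracking URI and run ID are placeholders:

```python
import json
from urllib import request

def log_metric_request(tracking_uri: str, run_id: str, key: str, value: float,
                       step: int = 0, timestamp_ms: int = 0) -> request.Request:
    """Build (but do not send) a request to MLflow's runs/log-metric REST endpoint."""
    body = json.dumps({
        "run_id": run_id,
        "key": key,
        "value": value,
        "timestamp": timestamp_ms,
        "step": step,
    }).encode()
    return request.Request(
        url=f"{tracking_uri}/api/2.0/mlflow/runs/log-metric",
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

req = log_metric_request("http://localhost:5000", "abc123", "accuracy", 0.91)
# request.urlopen(req)  # would send it to a live tracking server
```

In practice the `mlflow` Python client (`mlflow.log_metric`) wraps these same endpoints; the sketch just makes the HTTP-server nature of the tracking server concrete.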


Principal Financial Group uses QnABot on AWS and Amazon Q Business to enhance workforce productivity with generative AI

AWS Machine Learning

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
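The "single API" here is the Bedrock runtime's InvokeModel call: switching providers means changing the model ID and request body, not the integration. A minimal sketch using the Anthropic messages body shape accepted on Bedrock; the prompt and token limit are arbitrary:

```python
import json

def build_invoke_request(model_id: str, prompt: str, max_tokens: int = 256):
    """Build the (model_id, body) pair for a bedrock-runtime InvokeModel call,
    using the Anthropic messages request format accepted on Bedrock."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })
    return model_id, body

model_id, body = build_invoke_request(
    "anthropic.claude-3-haiku-20240307-v1:0", "Summarize our leave policy.")
# With credentials configured, the call goes through the single Bedrock API:
# resp = boto3.client("bedrock-runtime").invoke_model(modelId=model_id, body=body)
```

Swapping in a model from another provider (AI21 Labs, Cohere, Meta, Mistral AI, and so on) changes only the model ID and that provider's body schema.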


Governing the ML lifecycle at scale: Centralized observability with Amazon SageMaker and Amazon CloudWatch

AWS Machine Learning

This post is part of an ongoing series on governing the machine learning (ML) lifecycle at scale. To start from the beginning, refer to Governing the ML lifecycle at scale, Part 1: A framework for architecting ML workloads using Amazon SageMaker. Consolidate metrics across source accounts and build unified dashboards.
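With CloudWatch cross-account observability enabled, a central monitoring account can consolidate metrics by tagging each GetMetricData query with the source account's ID. A minimal sketch, assuming hypothetical source account IDs and a SageMaker metric as the example:

```python
# Hypothetical source account IDs; with CloudWatch cross-account observability
# enabled, a monitoring account can query them via the AccountId query field.
SOURCE_ACCOUNTS = ["111111111111", "222222222222"]

def build_queries(metric_name: str = "Invocations",
                  namespace: str = "AWS/SageMaker") -> list[dict]:
    """Build GetMetricData queries pulling one metric from each source account."""
    return [
        {
            "Id": f"m{i}",
            "AccountId": account,  # cross-account field on MetricDataQuery
            "MetricStat": {
                "Metric": {"Namespace": namespace, "MetricName": metric_name},
                "Period": 300,   # 5-minute aggregation
                "Stat": "Sum",
            },
        }
        for i, account in enumerate(SOURCE_ACCOUNTS)
    ]

queries = build_queries()
# boto3.client("cloudwatch").get_metric_data(
#     MetricDataQueries=queries, StartTime=start, EndTime=end)
```

The consolidated results can then feed a single dashboard in the monitoring account instead of one dashboard per source account.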