
Generating value from enterprise data: Best practices for Text2SQL and generative AI

AWS Machine Learning

In this post, we provide an introduction to text-to-SQL (Text2SQL) and explore use cases, challenges, design patterns, and best practices. Today, a large amount of data lives in traditional data analytics systems, data warehouses, and databases, yet it may not be easy for most members of an organization to query or understand.
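
To make the idea concrete, here is a minimal sketch of schema-grounded Text2SQL prompting. It is not code from the post; `call_llm` is a hypothetical stand-in for whatever model invocation you use (for example, an Amazon Bedrock call), and the schema and question are invented for illustration.

```python
# Minimal Text2SQL prompt construction (illustrative sketch, not from the post).
# `call_llm` is a hypothetical helper standing in for any LLM invocation.

SCHEMA = """
CREATE TABLE orders (
    order_id     INT,
    customer_id  INT,
    order_date   DATE,
    total_amount DECIMAL(10, 2)
);
"""

PROMPT_TEMPLATE = """You are a SQL assistant.
Given the schema below, write a single SQL query that answers the question.
Return only SQL, with no explanation.

Schema:
{schema}

Question: {question}
SQL:"""


def text_to_sql(question: str, call_llm) -> str:
    """Build a schema-grounded prompt and ask the model for a SQL query."""
    prompt = PROMPT_TEMPLATE.format(schema=SCHEMA, question=question)
    return call_llm(prompt).strip()


if __name__ == "__main__":
    # Stubbed model call so the sketch runs end to end without a real endpoint.
    fake_llm = lambda _prompt: (
        "SELECT customer_id, SUM(total_amount) AS revenue "
        "FROM orders GROUP BY customer_id ORDER BY revenue DESC LIMIT 5;"
    )
    print(text_to_sql("Who are our top five customers by revenue?", fake_llm))
```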


Ground truth curation and metric interpretation best practices for evaluating generative AI question answering using FMEval

AWS Machine Learning

In this post, we discuss best practices for ground truth curation and metric interpretation with FMEval when evaluating question answering applications for factual knowledge and answer quality. Ground truth data in AI refers to data that is known to be true; it represents the expected outcome for the system being modeled.
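
As a rough illustration of ground truth curation, the sketch below builds a small JSON Lines dataset of questions and accepted answers. The field names are placeholders to be mapped through FMEval's dataset configuration, and the `<OR>` delimiter for alternative acceptable answers is an assumption; check your FMEval evaluation configuration before relying on it.

```python
# Illustrative ground truth curation sketch for QA evaluation (not the post's code).
# Field names are placeholders; map them to FMEval's expected inputs via its
# dataset configuration. The "<OR>" delimiter for answer variants is an assumption.
import json

ground_truth = [
    {
        "question": "What year was Amazon founded?",
        # Accept known-correct variants so exact-match style metrics
        # do not penalize equivalent answers.
        "target_answer": "1994<OR>in 1994",
    },
    {
        "question": "Who founded Amazon?",
        "target_answer": "Jeff Bezos<OR>Jeffrey Bezos",
    },
]

with open("qa_ground_truth.jsonl", "w") as f:
    for record in ground_truth:
        f.write(json.dumps(record) + "\n")
```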

Trending Sources


Designing generative AI workloads for resilience

AWS Machine Learning

There are unique considerations when engineering generative AI workloads through a resilience lens. Make sure to validate prompt input data and prompt input size against the character limits defined by your model. If you're performing prompt engineering, persist your prompts to a reliable data store.
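
A minimal sketch of that kind of prompt validation might look like the following; the character limit and names are assumptions for illustration, not values from the post, so substitute the limit documented for your model.

```python
# Minimal sketch of prompt input validation before invoking a model
# (illustrative; the limit below is an assumption, not from the post).
MAX_PROMPT_CHARS = 20_000  # replace with the limit documented for your model


def validate_prompt(prompt: str) -> str:
    """Reject empty or oversized prompts before they reach the model endpoint."""
    if not prompt or not prompt.strip():
        raise ValueError("Prompt is empty.")
    if len(prompt) > MAX_PROMPT_CHARS:
        raise ValueError(
            f"Prompt is {len(prompt)} characters; "
            f"the allocated limit is {MAX_PROMPT_CHARS}."
        )
    return prompt
```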


Achieve operational excellence with well-architected generative AI solutions using Amazon Bedrock

AWS Machine Learning

Managing bias, intellectual property, prompt safety, and data integrity are critical considerations when deploying generative AI solutions at scale. Because this is an emerging area, best practices, practical guidance, and design patterns are difficult to find in an easily consumable form.


MLOps deployment best practices for real-time inference model serving endpoints with Amazon SageMaker

AWS Machine Learning

After spending many hours in a materials research lab, he quickly left his background in chemical engineering behind to pursue his interest in machine learning. Shelbee is a co-creator and instructor of the Practical Data Science specialization on Coursera. Clay Elmore is an AI/ML Specialist Solutions Architect at AWS.


How DPG Media uses Amazon Bedrock and Amazon Transcribe to enhance video metadata with AI-powered pipelines

AWS Machine Learning

About the Authors: Lucas Desard is a GenAI Engineer at DPG Media. Tom Lauwers is a machine learning engineer on the video personalization team for DPG Media. As the manager of the team, he guides ML and software engineers in building recommendation systems and generative AI solutions for the company.


Generate financial industry-specific insights using generative AI and in-context fine-tuning

AWS Machine Learning

In this blog post, we demonstrate prompt engineering techniques to generate accurate and relevant analysis of tabular data using industry-specific language. This is done by providing large language models (LLMs) with in-context sample data, including features and labels, in the prompt. For certain use cases, fine-tuning may still be required.
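
As a rough sketch of that in-context pattern (not the post's code), the snippet below places a few labeled sample rows in the prompt before asking for an analysis of a new row; the column names and wording are invented for illustration.

```python
# Illustrative in-context prompt for tabular data analysis (not the post's code).
# A few labeled sample rows teach the model the domain vocabulary and label
# format before it sees the new, unlabeled row.

SAMPLE_ROWS = """revenue_growth,debt_to_equity,outlook
0.12,0.4,"Revenue grew 12% with conservative leverage; outlook positive."
-0.05,1.8,"Revenue contracted 5% amid high leverage; outlook cautious."
"""

NEW_ROW = "0.08,0.9"

prompt = f"""You are a financial analyst. Given the labeled examples below,
write a one-sentence outlook for the new row in the same style.

Examples (features and label):
{SAMPLE_ROWS}
New row (revenue_growth,debt_to_equity): {NEW_ROW}
Outlook:"""

print(prompt)  # send this prompt to your LLM of choice
```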