Today, we are announcing the public preview of long-running execution (asynchronous) flow support in Amazon Bedrock Flows. With Amazon Bedrock Flows, you can link together foundation models (FMs), Amazon Bedrock Prompt Management, Amazon Bedrock Agents, Amazon Bedrock Knowledge Bases, Amazon Bedrock Guardrails, and other AWS services to build and scale predefined generative AI workflows.

As customers across industries build increasingly sophisticated applications, they have shared feedback about needing to process larger datasets and run complex workflows that take more than a few minutes to complete. Many customers have told us they want to transform entire books, process massive documents, and orchestrate multi-step AI workflows without worrying about runtime limits, highlighting the need for a solution that can handle long-running background tasks. To address these needs, Amazon Bedrock Flows is introducing a new capability, in public preview, that extends workflow execution time from 5 minutes (synchronous) to 24 hours (asynchronous).

With long-running execution (asynchronous) flows in Amazon Bedrock, you can chain together multiple prompts, AI services, and Amazon Bedrock components into complex, long-running workflows (running asynchronously for up to 24 hours). The new capability includes built-in execution tracing for observability, available directly through the AWS Management Console and the Amazon Bedrock Flows APIs. These enhancements significantly simplify workflow development and management in Amazon Bedrock Flows, helping you focus on building and deploying your generative AI applications.

By decoupling workflow execution (through long-running flows that run asynchronously for up to 24 hours) from the user's immediate interaction, you can now build applications that handle large payloads requiring more than 5 minutes of processing, perform resource-intensive tasks, apply multiple decision rules, and even run flows in the background while integrating with multiple systems, all while providing users a seamless and responsive experience.

Solution overview

Organizations using Amazon Bedrock Flows can now use the long-running execution flow capability to design and deploy long-running workflows and build more scalable, efficient generative AI applications. This capability provides the following benefits:

  • Long-running workflows – You can run long-running (up to 24 hours) workflows as background tasks, decoupling workflow execution from the user's immediate interaction.
  • Large payloads – The capability supports large payload processing and resource-intensive tasks that can run for up to 24 hours, instead of the previous 5-minute limit.
  • Complex use cases – It can manage the execution of complex, multi-step, decision-making generative AI workflows that integrate with multiple external systems.
  • Builder-friendly – You can create and manage long-running execution flows through the Amazon Bedrock APIs and the Amazon Bedrock console.
  • Observability – You get a seamless user experience, with the ability to check flow execution status and retrieve results accordingly. The capability also provides traces so you can view the inputs and outputs of each node.

Dentsu, a leading advertising agency and creative powerhouse, needed to handle complex, multi-step generative AI use cases that require longer execution times. One such use case is their Easy Reading application, which converts books containing many chapters and illustrations into an easy-to-read format, making literature accessible to people with intellectual disabilities. With long-running execution flows in Amazon Bedrock, Dentsu can now:

  • Process large inputs and perform complex, resource-intensive tasks within a workflow. Before long-running execution flows, input size was limited because flow execution was capped at 5 minutes.
  • Integrate multiple external systems and services into generative AI workflows.
  • Support both fast, near real-time workflows and longer-running, more complex ones.

“Amazon Bedrock has been amazing to work with and to demonstrate value to our clients,” says Victoria Aiello, Innovation Director at Dentsu Creative Brazil. “Using traces and flows, we are able to show the processing behind the work the AI is performing, giving us better visibility into and accuracy of the content to be generated. For the Easy Reading use case, long-running execution flows will allow an entire book to be processed in one go, taking advantage of the 24-hour flow execution time, instead of writing custom code to manage multiple parts of the book separately. This saves us time when producing new books or even integrating with different models; we can test different outcomes depending on each book's needs or content.”

Let's explore how the new long-running execution flow capability in Amazon Bedrock Flows makes it possible for Dentsu to build a more efficient, longer-running book processing generative AI application. The following diagram illustrates the end-to-end flow of Dentsu's book processing application. The process begins when a client uploads a book to Amazon Simple Storage Service (Amazon S3), triggering a flow that processes multiple chapters, where each chapter undergoes accessibility transformation and formatting based on specific user requirements. The transformed chapters are then collected, merged with a table of contents, and stored back in Amazon S3 as the final accessible document. This long-running execution (asynchronous) flow handles large books efficiently, processing them within the 24-hour execution window while providing status updates and traceability throughout the transformation process.

In the following sections, we demonstrate how to create a long-running execution flow in Amazon Bedrock Flows using Dentsu’s real-world use case of books transformation.

Prerequisites

Before implementing the new capabilities, make sure you have the following:

After these components are in place, you can implement Amazon Bedrock long-running execution flow capabilities in your generative AI use case.

Create a long-running execution flow

Complete the following steps to create your long-running execution flow:

  1. On the Amazon Bedrock console, in the navigation pane under Builder tools, choose Flows.
  2. Choose Create a flow.
  3. Provide a name for your new flow, for example, easy-read-long-running-flow.

For detailed instructions on creating a flow, see Amazon Bedrock Flows is now generally available with enhanced safety and traceability. Amazon Bedrock provides different node types to build your prompt flow.

The following screenshot shows the high-level flow of Dentsu’s book conversion generative AI-powered application. The workflow demonstrates a sequential process from input handling through content transformation to final storage and delivery.

AWS Bedrock Flow Builder interface displaying easy-read-long-running-flow with connected components for document processing and storage

The following table outlines the core components and nodes within the preceding workflow, designed for document processing and accessibility transformation.

Node – Purpose
Flow Input – Entry point accepting an array of S3 prefixes (chapters) and an accessibility profile
Iterator – Processes each chapter (prefix) individually
S3 Retrieval – Downloads chapter content from the specified Amazon S3 location
Easifier – Applies accessibility transformation rules to the chapter content
HTML Formatter – Formats the transformed content with appropriate HTML structure
Collector – Assembles the transformed chapters while maintaining order
Lambda Function – Combines chapters into a single document with a table of contents
S3 Storage – Stores the final transformed document in Amazon S3
Flow Output – Returns the Amazon S3 location of the transformed book with metadata
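As a concrete illustration of the Lambda Function node in the table above, the following is a minimal sketch of the chapter-merging step. The input shape (an ordered list of title/HTML fragments) and the function name are assumptions made for illustration, not Dentsu's actual implementation.

```python
# Hypothetical sketch of the "Lambda Function" node: combine transformed
# chapter fragments into a single HTML document with a table of contents.
# The input contract (a list of {"title", "html"} dicts) is an assumption.

def combine_chapters(book_title, chapters):
    """Merge ordered chapter fragments into one document with a TOC."""
    toc_items = []
    body_parts = []
    for i, chapter in enumerate(chapters, start=1):
        anchor = f"chapter-{i}"
        toc_items.append(f'<li><a href="#{anchor}">{chapter["title"]}</a></li>')
        body_parts.append(f'<section id="{anchor}">{chapter["html"]}</section>')
    return (
        f"<html><body><h1>{book_title}</h1>"
        f"<nav><ol>{''.join(toc_items)}</ol></nav>"
        f"{''.join(body_parts)}</body></html>"
    )

doc = combine_chapters(
    "Beyond Earth: Humanity's Journey to the Stars",
    [
        {"title": "Chapter 1", "html": "<p>First satellites...</p>"},
        {"title": "Chapter 2", "html": "<p>Moonwalks...</p>"},
    ],
)
```

Because the Collector node preserves chapter order, a merge step like this only has to concatenate; it never re-sorts content.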

Test the book processing flow

We are now ready to test the flow through the Amazon Bedrock console or API. We use a fictional book called “Beyond Earth: Humanity’s Journey to the Stars.” This book tells the story of humanity’s greatest adventure beyond our home planet, tracing our journey from the first satellites and moonwalks to space stations and robotic explorers that continue to unveil the mysteries of our solar system.

  1. On the Amazon Bedrock console, choose Flows in the navigation pane.
  2. Choose the flow (easy-read-long-running-flow) and choose Create execution.

The flow must be in the Prepared state before creating an execution.

The Execution tab shows the previous executions for the selected flow.

AWS Bedrock Flow details page showing flow configuration and execution status

  3. Provide the following input:

dyslexia test input

{
  "chapterPrefixes": [
    "books/beyond-earth/chapter_1.txt",
    "books/beyond-earth/chapter_2.txt",
    "books/beyond-earth/chapter_3.txt"
  ],
  "metadata": {
    "accessibilityProfile": "dyslexia",
    "bookId": "beyond-earth-002",
    "bookTitle": "Beyond Earth: Humanity's Journey to the Stars"
  }
}

These are the different chapters of our book that need to be transformed.

  4. Choose Create.

AWS Bedrock Flow execution setup modal with name, alias selection, and JSON configuration for book processing

Amazon Bedrock Flows initiates the long-running (asynchronous) execution of our workflow. The dashboard displays the executions of our flow with their respective statuses (Running, Succeeded, Failed, TimedOut, Aborted). When an execution is marked as Succeeded, the results become available in our designated S3 bucket.

AWS Bedrock Flow dashboard displaying flow details and active execution status for easy-read implementation

Choosing an execution takes you to the summary page containing its details. The Overview section displays start and end times, plus the execution Amazon Resource Name (ARN)—a unique identifier that’s essential for troubleshooting specific executions later.

AWS execution interface with status, summary details, and workflow diagram of connected services

When you select a node in the flow builder, its configuration details appear. For instance, choosing the Easifier node reveals the prompt used, the selected model (here it’s Amazon Nova Lite), and additional configuration parameters. This is essential information for understanding how that specific component is set up.

AWS Bedrock interface with LLM settings, prompt configuration, and service workflow visualization

The system also provides access to execution traces, offering detailed insights into each processing step, tracking real-time performance metrics, and highlighting any issues that occurred during the flow's execution. Traces can be enabled using the API and sent to Amazon CloudWatch Logs. In the API, set the enableTrace field to true in an InvokeFlow request. Each flowOutputEvent in the response is then returned alongside a flowTraceEvent.
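The trace-enabled invocation just described can be sketched with the boto3 bedrock-agent-runtime client's invoke_flow operation. The flow and alias identifiers and the input node name below are placeholders, and boto3 is imported lazily so the request-building helper stands on its own without AWS access.

```python
# Sketch: enable tracing on an InvokeFlow request. Identifiers are
# placeholders; the input node name "FlowInputNode" is an assumption.

def build_invoke_request(flow_id, alias_id, document):
    """Build the InvokeFlow request payload, with tracing enabled."""
    return {
        "flowIdentifier": flow_id,
        "flowAliasIdentifier": alias_id,
        "enableTrace": True,  # a flowTraceEvent accompanies each flowOutputEvent
        "inputs": [
            {
                "nodeName": "FlowInputNode",
                "nodeOutputName": "document",
                "content": {"document": document},
            }
        ],
    }

def invoke_with_trace(flow_id, alias_id, document):
    """Invoke the flow and print trace and output events as they stream back."""
    import boto3  # requires AWS credentials at call time

    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_flow(**build_invoke_request(flow_id, alias_id, document))
    for event in response["responseStream"]:
        if "flowTraceEvent" in event:
            print(event["flowTraceEvent"])  # per-node inputs and outputs
        elif "flowOutputEvent" in event:
            print(event["flowOutputEvent"])

request = build_invoke_request("FLOWID123", "ALIASID1", {"text": "chapter text"})
```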

AWS flow execution trace showing processing steps for book chapter conversion

We have now successfully created and executed a long-running execution flow. You can also use Amazon Bedrock APIs to programmatically start, stop, list, and get flow executions. For more details on how to configure flows with enhanced safety and traceability, refer to Amazon Bedrock Flows is now generally available with enhanced safety and traceability.
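A minimal sketch of driving an execution programmatically might look like the following. The start_flow_execution and get_flow_execution operation names and fields reflect our reading of the preview API and should be checked against the current bedrock-agent-runtime documentation; the identifiers are placeholders, and boto3 is imported lazily so the status helper can be read without AWS access.

```python
# Sketch: start a long-running flow execution and poll until it reaches
# a terminal status. API shapes are assumptions based on the preview.

TERMINAL_STATUSES = {"Succeeded", "Failed", "TimedOut", "Aborted"}

def is_finished(status):
    """Return True once an execution has reached a terminal status."""
    return status in TERMINAL_STATUSES

def run_book_flow(flow_id, alias_id, document, poll_seconds=60):
    import time
    import boto3  # requires AWS credentials at call time

    client = boto3.client("bedrock-agent-runtime")
    execution_arn = client.start_flow_execution(
        flowIdentifier=flow_id,
        flowAliasIdentifier=alias_id,
        inputs=[{
            "nodeName": "FlowInputNode",  # assumed input node name
            "nodeOutputName": "document",
            "content": {"document": document},
        }],
    )["executionArn"]
    while True:
        status = client.get_flow_execution(
            flowIdentifier=flow_id,
            flowAliasIdentifier=alias_id,
            executionIdentifier=execution_arn,
        )["status"]
        if is_finished(status):
            return status
        time.sleep(poll_seconds)  # long-running: check back periodically
```

Because executions can run for up to 24 hours, a production caller would typically use a longer polling interval, or react to completion events rather than poll at all.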

Conclusion

The introduction of long-running execution flows in Amazon Bedrock Flows represents a significant advancement for generative AI development. With these capabilities, you can create more efficient AI-powered solutions that automate long-running operations, addressing key challenges in the rapidly evolving field of AI application development.

Long-running execution flow support in Amazon Bedrock Flows is now available in public preview in AWS Regions, except the AWS GovCloud (US) Regions. To get started, open the Amazon Bedrock console or use the APIs to begin building long-running execution flows with Amazon Bedrock Flows. To learn more, see Create your first flow in Amazon Bedrock and Track each step in your flow by viewing its trace in Amazon Bedrock.

We're excited to see the innovative applications you will build with these new capabilities. As always, we welcome your feedback through AWS re:Post for Amazon Bedrock or your usual AWS contacts. Join the generative AI builder community at community.aws to share your experiences and learn from others.

Last modified: Monday, September 22, 2025 - 10:26