Building an Earnings Report Agent with Swarm Framework

Imagine if you could automate the tedious process of analyzing earnings reports, extracting key insights, and making informed recommendations, all without lifting a finger. In this article, we’ll walk you through how to create a multi-agent system using OpenAI’s Swarm framework, designed to handle exactly these tasks. You’ll learn how to set up and orchestrate three specialized agents: one to summarize earnings reports, another to analyze sentiment, and a third to generate actionable recommendations. By the end of this tutorial, you’ll have a scalable, modular solution to streamline financial analysis, with potential applications beyond just earnings reports.

Learning Outcomes

  • Understand the fundamentals of OpenAI’s Swarm framework for multi-agent systems.
  • Learn how to create agents for summarization, sentiment analysis, and recommendations.
  • Explore the use of modular agents for earnings report analysis.
  • Securely manage API keys using a .env file.
  • Implement a multi-agent system to automate earnings report processing.
  • Gain insights into real-world applications of multi-agent systems in finance.
  • Set up and execute a multi-agent workflow using OpenAI’s Swarm framework.

This article was published as a part of the Data Science Blogathon.


What is OpenAI’s Swarm?

Swarm is a lightweight, experimental framework from OpenAI that focuses on multi-agent orchestration. It lets us coordinate multiple agents, each handling a specific task, like summarizing content, performing sentiment analysis, or recommending actions. In our case, we’ll design three agents:

  • Summary Agent: Provides a concise summary of the earnings report.
  • Sentiment Agent: Analyzes the sentiment of the report.
  • Recommendation Agent: Recommends actions based on the sentiment analysis.

Use Cases and Benefits of Multi-Agent Systems

You can expand the multi-agent system built here for various use cases.

  • Portfolio Management: Automate the monitoring of multiple company reports and suggest portfolio changes based on sentiment trends.
  • News Summarization for Finance: Integrate real-time news feeds with these agents to detect potential market movements early.
  • Sentiment Monitoring: Use sentiment analysis to predict stock movements or crypto trends based on positive or negative market news.

By splitting tasks into modular agents, you can reuse individual components across different projects, allowing for flexibility and scalability.

Step 1: Setting Up Your Project Environment

Before we dive into coding, it’s important to lay a solid foundation for the project. In this step, you’ll create the necessary folders and files and install the required dependencies to get everything working smoothly.

mkdir earnings_report
cd earnings_report
mkdir agents utils
touch main.py agents/__init__.py utils/__init__.py .gitignore

Install Dependencies

pip install git+https://github.com/openai/swarm.git openai python-dotenv

Step 2: Store Your API Key Securely

Security is key, especially when working with sensitive data like API keys. This step shows you how to store your OpenAI API key securely in a .env file, ensuring your credentials stay safe and sound.

OPENAI_API_KEY=your-openai-api-key-here

This ensures your API key is never exposed in your code.
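For context, python-dotenv does nothing magical: it reads KEY=VALUE pairs from the file into the process environment. Below is a minimal, standard-library-only sketch of that idea; the real package additionally handles quoting, comments inside values, and variable interpolation, so use it in practice.

```python
import os

def load_env_file(path=".env"):
    """Naive .env loader: copy KEY=VALUE lines into os.environ.
    A sketch of what python-dotenv does, not a replacement for it."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and lines without a KEY=VALUE shape
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault mirrors dotenv's default of not overriding existing vars
            os.environ.setdefault(key.strip(), value.strip())
```

After calling `load_env_file()`, `os.getenv("OPENAI_API_KEY")` returns your key, just as it does after `load_dotenv()`.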

Step 3: Implement the Agents

Now it’s time to bring your agents to life! In this step, you’ll create three separate agents: one for summarizing the earnings report, another for sentiment analysis, and a third for generating actionable recommendations based on the sentiment.

Summary Agent

The Summary Agent will extract the first 100 characters of the earnings report as a summary.

Create agents/summary_agent.py:

from swarm import Agent

def summarize_report(context_variables):
    report_text = context_variables["report_text"]
    return f"Summary: {report_text[:100]}..."

summary_agent = Agent(
    name="Summary Agent",
    instructions="Summarize the key points of the earnings report.",
    functions=[summarize_report]
)

Sentiment Agent

This agent checks whether the word “profit” appears in the report to decide if the sentiment is positive.

Create agents/sentiment_agent.py:

from swarm import Agent

def analyze_sentiment(context_variables):
    report_text = context_variables["report_text"]
    sentiment = "positive" if "profit" in report_text else "negative"
    return f"The sentiment of the report is: {sentiment}"

sentiment_agent = Agent(
    name="Sentiment Agent",
    instructions="Analyze the sentiment of the report.",
    functions=[analyze_sentiment]
)

Recommendation Agent

Based on the sentiment, this agent will suggest “Buy” or “Hold”.

Create agents/recommendation_agent.py:

from swarm import Agent

def generate_recommendation(context_variables):
    sentiment = context_variables["sentiment"]
    recommendation = "Buy" if sentiment == "positive" else "Hold"
    return f"My recommendation is: {recommendation}"

recommendation_agent = Agent(
    name="Recommendation Agent",
    instructions="Recommend actions based on the sentiment analysis.",
    functions=[generate_recommendation]
)

Step 4: Add a Helper Function for File Loading

Loading data efficiently is a critical part of any project. Here, you’ll create a helper function that streamlines reading the earnings report file, making it easier for your agents to access the data.

Create utils/helpers.py:

def load_earnings_report(filepath):
    with open(filepath, "r") as file:
        return file.read()
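The three-line version above is all the tutorial needs. If you would like a clearer failure when the report file is missing, an optional hardened variant with the same signature (a sketch, not part of the original helper) could look like this:

```python
from pathlib import Path

def load_earnings_report(filepath):
    """Read an earnings report, raising a descriptive error if it is missing."""
    path = Path(filepath)
    if not path.is_file():
        raise FileNotFoundError(
            f"Earnings report not found at {path.resolve()}; "
            "create it first (see Step 6)."
        )
    # Explicit encoding avoids platform-dependent defaults
    return path.read_text(encoding="utf-8")
```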

Step 5: Tie Everything Together in main.py

With your agents ready, it’s time to tie everything together. In this step, you’ll write the main script that orchestrates the agents, letting them work in concert to analyze the earnings report and deliver insights.

from swarm import Swarm
from agents.summary_agent import summary_agent
from agents.sentiment_agent import sentiment_agent
from agents.recommendation_agent import recommendation_agent
from utils.helpers import load_earnings_report
import os
from dotenv import load_dotenv

# Load environment variables from the .env file
load_dotenv()

# load_dotenv() has already placed the key in os.environ; fail fast if it is missing
if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; add it to your .env file")

# Initialize the Swarm client
client = Swarm()

# Load earnings report
report_text = load_earnings_report("sample_earnings.txt")

# Run the summary agent
response = client.run(
    agent=summary_agent,
    messages=[{"role": "user", "content": "Summarize the report"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Run the sentiment agent on the report text
response = client.run(
    agent=sentiment_agent,
    messages=[{"role": "user", "content": "Analyze the sentiment"}],
    context_variables={"report_text": report_text}
)
print(response.messages[-1]["content"])

# Extract the sentiment and run the recommendation agent
sentiment = response.messages[-1]["content"].split(": ")[-1].strip()
response = client.run(
    agent=recommendation_agent,
    messages=[{"role": "user", "content": "Give a recommendation"}],
    context_variables={"sentiment": sentiment}
)
print(response.messages[-1]["content"])
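One fragile spot in main.py is worth flagging: `split(": ")[-1]` assumes the model echoes the function’s exact phrasing, but the agent’s reply is still LLM-generated text. A more defensive extraction (a hypothetical helper, not part of the original script) could search for the keywords instead:

```python
import re

def extract_sentiment(message):
    """Pull 'positive' or 'negative' out of free-form agent output.
    Defaults to 'negative' if neither word appears, which keeps the
    Recommendation Agent on the conservative 'Hold' path."""
    match = re.search(r"\b(positive|negative)\b", message.lower())
    return match.group(1) if match else "negative"
```

You could then replace the `split`-based line with `sentiment = extract_sentiment(response.messages[-1]["content"])`.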

Step 6: Create a Sample Earnings Report

To test your system, you need data! This step shows you how to create a sample earnings report for your agents to process, ensuring everything is ready for action.

Create sample_earnings.txt in the project root:

Company XYZ reported a 20% increase in profits compared to the previous quarter.
Sales grew by 15%, and the company expects continued growth in the next fiscal year.

Step 7: Run the Program

Now that everything is set up, it’s time to run the program and watch your multi-agent system in action as it analyzes the earnings report, performs sentiment analysis, and offers recommendations.

python main.py

Expected Output:

(Screenshot: console output showing the report summary, the sentiment line, and the final recommendation.)

Conclusion

We’ve built a multi-agent solution using OpenAI’s Swarm framework to automate the analysis of earnings reports. With just a few agents, we can process financial information and offer actionable recommendations. You can easily extend this solution by adding new agents for deeper analysis or by integrating real-time financial APIs.

Try it yourself and see how you can enhance it with more data sources or additional agents for more advanced analysis!

Key Takeaways

  • Modular Architecture: Breaking the system into multiple agents and utilities keeps the code maintainable and scalable.
  • Swarm Framework Power: Swarm enables simple handoffs between agents, making it easy to build complex multi-agent workflows.
  • Security via .env: Managing API keys with dotenv ensures that sensitive data isn’t hardcoded into the project.
  • Live Data Ready: The project can be expanded to handle live financial data by integrating APIs, enabling real-time recommendations for investors.

Frequently Asked Questions

Q1. What is OpenAI’s Swarm framework?

A. OpenAI’s Swarm is an experimental framework designed for coordinating multiple agents that each perform a specific task. It is ideal for building modular systems where each agent has a defined role, such as summarizing content, performing sentiment analysis, or generating recommendations.

Q2. What are the key components of a multi-agent system?

A. In this tutorial, the multi-agent system consists of three key agents: the Summary Agent, the Sentiment Agent, and the Recommendation Agent. Each agent performs a specific function, such as summarizing an earnings report, analyzing its sentiment, or recommending actions based on that sentiment.

Q3. How do I secure my OpenAI API key in this project?

A. You can store your API key securely in a .env file. That way, the key is never exposed directly in your code. The .env file can be loaded using the python-dotenv package.

Q4. Can I expand this project to handle live financial data?

A. Yes, the project can be extended to handle live data by integrating financial APIs. You can create additional agents to fetch real-time earnings reports and analyze trends to provide up-to-date recommendations.

Q5. Can I reuse the agents in other projects?

A. Yes, the agents are designed to be modular, so you can reuse them in other projects. You can adapt them to different tasks such as summarizing news articles, performing text sentiment analysis, or making recommendations based on any kind of structured data.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.
