
Building an AI-Powered News Sentiment Analyzer with Streamlit

In this article, I’ll walk you through the process of building an AI-Powered News Sentiment Analyzer using Python, Streamlit, and Natural Language Processing (NLP) tools. This app fetches recent news articles from Google News RSS, analyzes their sentiment, generates summaries, and visualizes the most frequent words using a word cloud. By the end of this article, you’ll have a fully functional web app that you can deploy and share with others.


Table of Contents

  1. Introduction
  2. Tools and Technologies Used
  3. Step 1: Fetching News Articles
  4. Step 2: Sentiment Analysis with TextBlob
  5. Step 3: Generating Summaries
  6. Step 4: Creating a Word Cloud
  7. Step 5: Building the Streamlit App
  8. Step 6: Adding an About Us Page
  9. Step 7: Deploying the App on Streamlit Cloud
  10. Conclusion

1. Introduction

The WanNkan: AI-Powered News Sentiment Analyzer is a web application that helps users stay informed by analyzing the sentiment of recent news articles. It provides:

  • Sentiment Analysis: Determines if an article is positive, negative, or neutral.
  • Summaries: Generates a short summary of each article.
  • Word Cloud: Visualizes the most frequent words in the articles.

This app is built using Streamlit, a powerful framework for creating data apps, and leverages TextBlob for sentiment analysis and WordCloud for visualization.


2. Tools and Technologies Used

  • Python: The programming language used for the app.
  • Streamlit: A framework for building interactive web apps.
  • TextBlob: A Python library for processing textual data and performing sentiment analysis.
  • WordCloud: A library for generating word clouds.
  • Feedparser: A library for parsing RSS feeds.
  • Matplotlib: A plotting library for creating visualizations.

3. Step 1: Fetching News Articles

The first step is to fetch news articles from Google News RSS. We use the feedparser library to parse the RSS feed and extract relevant information such as the title, description, and link.

import feedparser

def fetch_google_news_rss(topic):
    # Google News RSS search feed for the given topic (English, US edition)
    url = f'https://news.google.com/rss/search?q={topic}&hl=en-US&gl=US&ceid=US:en'
    feed = feedparser.parse(url)
    # Return the title, description, and link for each entry in the feed
    return [
        {'title': entry.title, 'description': entry.get('description', ''), 'link': entry.link}
        for entry in feed.entries
    ]

Explanation

  • The fetch_google_news_rss function takes a topic as input and constructs the Google News RSS URL.
  • It uses feedparser to parse the RSS feed and extract the title, description, and link of each article.
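
As a quick sanity check, you can call the function directly; the exact output depends on the live feed at the time you run it:

articles = fetch_google_news_rss("technology")
print(len(articles))           # number of entries returned by the feed
print(articles[0]['title'])    # headline of the first article
print(articles[0]['link'])     # URL of the first article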

4. Step 2: Sentiment Analysis with TextBlob

Next, we analyze the sentiment of each article using TextBlob. TextBlob provides a simple API for performing sentiment analysis, which returns a polarity score indicating whether the text is positive, negative, or neutral.

from textblob import TextBlob

def analyze_sentiment(text):
    analysis = TextBlob(text)
    if analysis.sentiment.polarity > 0:
        return 'Positive'
    elif analysis.sentiment.polarity < 0:
        return 'Negative'
    return 'Neutral'

Explanation

  • The analyze_sentiment function takes a text input and uses TextBlob to determine its sentiment.
  • The polarity score ranges from -1 (negative) to 1 (positive). A score of 0 indicates neutral sentiment.
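
A few illustrative calls (the classifications assume TextBlob's default analyzer and should come out as commented):

print(analyze_sentiment("I love this new phone, it is fantastic!"))    # Positive
print(analyze_sentiment("The update was a terrible disappointment."))  # Negative
print(analyze_sentiment("The meeting is scheduled for Monday."))       # Neutral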

5. Step 3: Generating Summaries

After analyzing the sentiment, we generate a summary for each article. The summary includes the headline, sentiment, and a snippet of the article’s content.

def generate_summary(articles):
    summaries = []
    all_text = " ".join(clean_description(article['description']) or article['title'] for article in articles)
    for article in articles:
        content = clean_description(article['description']) or article['title']
        summary = f"""
        Headline: {article['title']}  

        Sentiment: {analyze_sentiment(content)}  

        Summary: {content[:200]}...  

        ---
        """
        summaries.append(summary)
    return summaries, all_text

Explanation

  • The generate_summary function creates a summary for each article, including the headline, sentiment, and a snippet of the content.
  • It also collects all the text from the articles to generate a word cloud later.
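
Note that generate_summary calls a clean_description helper that is not shown in the snippets above. Google News RSS descriptions typically contain HTML markup, so a minimal sketch of such a helper (an assumption about how the full project implements it) could simply strip tags and collapse whitespace:

import re

def clean_description(raw_html):
    # Hypothetical helper: strip HTML tags and extra whitespace from an RSS description
    text = re.sub(r'<[^>]+>', ' ', raw_html or '')
    return re.sub(r'\s+', ' ', text).strip()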

6. Step 4: Creating a Word Cloud

To visualize the most frequent words in the articles, we use the WordCloud library. The word cloud provides a quick overview of the main topics discussed in the news articles.

from wordcloud import WordCloud
import matplotlib.pyplot as plt
import streamlit as st

def generate_word_cloud(text):
    wordcloud = WordCloud(width=800, height=400, background_color='white').generate(text)
    plt.figure(figsize=(10, 5))
    plt.imshow(wordcloud, interpolation='bilinear')
    plt.axis('off')
    st.pyplot(plt)  # render the Matplotlib figure inside the Streamlit app

Explanation

  • The generate_word_cloud function creates a word cloud from the combined text of all articles.
  • The word cloud is displayed using matplotlib and integrated into the Streamlit app with st.pyplot.
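
WordCloud already filters common English stop words by default; if feed-specific filler terms still dominate the picture, one optional tweak (not part of the original app) is to extend the stop-word set:

from wordcloud import WordCloud, STOPWORDS

# Optional: add feed-specific filler words to the default English stop-word set
custom_stopwords = STOPWORDS.union({'said', 'news', 'will'})
wordcloud = WordCloud(width=800, height=400, background_color='white',
                      stopwords=custom_stopwords).generate(text)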

7. Step 5: Building the Streamlit App

Now that we have all the components, we integrate them into a Streamlit app. Streamlit makes it easy to create interactive web apps with minimal code.

import streamlit as st

def main():
    st.title("WanNkan: AI-Powered News Sentiment Analyzer")
    st.write("This app fetches recent news articles, analyzes their sentiment, and generates a summary with a word cloud.")

    # User input for the topic
    topic = st.text_input("Enter a topic (e.g., technology, politics, sports):", "technology")

    if st.button("Analyze News"):
        st.write(f"Fetching and analyzing news articles for: **{topic}**")

        # Fetch news articles
        articles = fetch_google_news_rss(topic)
        if not articles:
            st.error("No articles found. Please try a different topic.")
        else:
            # Generate summaries and word cloud
            summaries, all_text = generate_summary(articles)
            
            # Display summaries
            st.subheader("News Summaries:")
            for summary in summaries:
                st.markdown(summary)
            
            # Display word cloud
            st.subheader("Word Cloud:")
            generate_word_cloud(all_text)

Explanation

  • The main function is the entry point of the Streamlit app.
  • It provides a user interface for entering a topic and clicking a button to analyze news articles.
  • The app displays the summaries and word cloud after fetching and analyzing the articles.

8. Step 6: Adding an About Us Page

To provide more context about the app and the team, we add an About Us page in the sidebar.

def about_us():
    st.sidebar.title("About Us")
    st.sidebar.markdown("""
    Welcome to the **WanNkan: AI-Powered News Sentiment Analyzer**! This app is designed to help you stay informed by analyzing the sentiment of recent news articles and providing a quick summary.

    ### How It Works:
    1. Enter a topic of interest (e.g., technology, politics, sports).  
    2. The app fetches the latest news articles from Google News RSS.  
    3. It analyzes the sentiment of each article (Positive, Negative, or Neutral).  
    4. A summary and a word cloud are generated to give you insights at a glance.  

    ### Meet the Team:
    - **Ayodeji Adesegun**: Chief Mathematical Officer (CMO) 
    

    ### Contact Us:
    Have questions or feedback? Reach out to us at [contact@newsanalyzer.com](mailto:contact@newsanalyzer.com).
    """)

Explanation

  • The about_us function displays information about the app and the team in the sidebar.
  • It uses Markdown for formatting and includes links for contact.
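
To tie everything together, the script still needs to call these functions when Streamlit executes it. One way to wire it up (the layout in the full project may differ) is:

if __name__ == "__main__":
    about_us()  # render the About Us sidebar on every rerun
    main()      # render the main page

You can then run the app locally with streamlit run app.py (assuming the script is saved as app.py) before moving on to deployment.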

9. Step 7: Deploying the App on Streamlit Cloud

To share the app with others, we deploy it on Streamlit Cloud. Follow these steps:

  1. Push the code to a GitHub repository.
  2. Go to Streamlit Cloud and connect your GitHub account.
  3. Select the repository and branch, then click “Deploy”.
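
Streamlit Cloud installs the app's dependencies from a requirements.txt file in the repository, so make sure one is committed alongside the code. A minimal version for this project might look like:

streamlit
feedparser
textblob
wordcloud
matplotlib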

10. Conclusion

In this article, we built an AI-Powered News Sentiment Analyzer using Python, Streamlit, and NLP tools. The app fetches news articles, analyzes their sentiment, generates summaries, and visualizes the most frequent words using a word cloud. With Streamlit, we created an interactive web app that can be easily deployed and shared.

This project demonstrates the power of combining NLP and web frameworks to create useful and engaging applications. You can extend this app by adding more features, such as filtering by date or source, or using more advanced NLP models for sentiment analysis.


Demo

You can try the app live on Streamlit Cloud:

Streamlit App


Final Code

You can find the complete code for this project on GitHub. Feel free to fork the repository and customize the app to suit your needs.

This post is licensed under CC BY 4.0 by the author.