
Photo by Editor | ChatGPT
Data analytics has changed. It is no longer enough to know tools like Python, SQL, and Excel to become a data analyst.
As a data professional at a tech company, I am seeing AI integrated into every employee's workflow. There is a sea of AI tools that can now access and analyze your entire database and help you build data analytics projects, machine learning models, and web applications in minutes.
If you are an aspiring data professional and are not using these AI tools, you are losing out. Soon, you will be left behind by other data analysts, the people who are using AI to improve their workflows.
In this article, I will walk you through AI tools that will help you stay ahead of the competition and 10x your data analytics workflow.
With these tools, you can:
- Build and deploy creative portfolio projects to get hired as a data analyst
- Use plain English to create end-to-end data analytics applications
- Accelerate your data workflow and become a more efficient data analyst
Additionally, this article is a step-by-step guide on how to use AI tools to build data analytics applications. We will focus on two AI tools in particular: Cursor and Pandas AI.
For the video version of this article, watch it here:
https://www.youtube.com/watch?v=ukidrskagai
AI Tool 1: Cursor
Cursor is an AI code editor that has access to your entire code base. You simply type a prompt into Cursor's chat interface, and it will access all the files in your directory and edit the code for you.
If you are a beginner and cannot write a single line of code, you can even start with an empty code folder and ask Cursor to build something for you. The AI tool will then follow your instructions and create code files according to your requirements.
Here is a guide on how you can use Cursor to create an end-to-end data analytics project without writing a single line of code.
Step 1: Cursor Installation and Setup
Let's see how we can use Cursor AI for data analytics.
To install Cursor, just go to www.cursor.com, download the version compatible with your OS, follow the installation instructions, and you will be set up in seconds.
Here is what the Cursor interface looks like:

Cursor AI interface
To follow along with this tutorial, download the train.csv file from the sentiment analysis dataset on Kaggle.
Then create a folder called "Sentiment Analysis Project" and move the downloaded train.csv file into it.
Finally, create an empty file named app.py. Your project folder should now look like this:

Sentiment Analysis Project folder
This will be our working directory.
Now, open this folder in Cursor by navigating to File -> Open Folder.
On the right of the screen is a chat interface where you can type prompts to Cursor. Notice that there are a few options in the dropdown menu. Let's select "Agent."
This tells Cursor to explore your code base and act as an AI assistant that will refactor and debug your code.
Additionally, you can choose which language model to use with Cursor (GPT-4o, Gemini-2.5-Pro, etc.). I recommend Claude-4-Sonnet, a model known for its strong coding abilities.
Step 2: Prompting Cursor to Build an Application
Let's now type this prompt into Cursor, telling it to create an end-to-end sentiment analysis app using the training dataset in our code base:
Create a sentiment analysis web app that:
1. Uses a pre-trained DistilBERT model to analyze the sentiment of text (positive, negative, or neutral)
2. Has a simple web interface where users can enter text and see results
3. Shows the sentiment result with appropriate colors (green for positive, red for negative)
4. Runs immediately without needing any training
Please connect all the files properly so that when I enter text and click analyze, it shows me the sentiment result right away.
After entering this prompt, Cursor will automatically generate the code files needed to build the sentiment analysis application.
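To give you a sense of what this produces, here is a minimal sketch of the kind of app Cursor might generate for this prompt. This is an illustrative example, not Cursor's actual output: it assumes Flask for the web interface and the Hugging Face transformers library for the pre-trained DistilBERT model (note the default sentiment pipeline returns only positive or negative labels, not neutral).
# A minimal sketch of what Cursor might generate (illustrative, not actual output)
# Install dependencies first: pip install flask transformers torch
from flask import Flask, request, render_template_string
from transformers import pipeline

app = Flask(__name__)

# The default sentiment pipeline loads a pre-trained DistilBERT model
# (distilbert-base-uncased-finetuned-sst-2-english), so no training is needed
classifier = pipeline("sentiment-analysis")

PAGE = """
<form method="post">
  <textarea name="text" rows="4" cols="50">{{ text }}</textarea><br>
  <button type="submit">Analyze</button>
</form>
{% if label %}
  <p style="color: {{ 'green' if label == 'POSITIVE' else 'red' }};">
    {{ label }} ({{ '%.2f' % score }})
  </p>
{% endif %}
"""

@app.route("/", methods=["GET", "POST"])
def analyze():
    text, label, score = "", None, 0.0
    if request.method == "POST":
        text = request.form.get("text", "")
        if text.strip():
            result = classifier(text)[0]  # e.g. {'label': 'POSITIVE', 'score': 0.99}
            label, score = result["label"], result["score"]
    return render_template_string(PAGE, text=text, label=label, score=score)

if __name__ == "__main__":
    app.run(debug=True)  # then open http://127.0.0.1:5000 in your browser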
Step 3: Accepting Changes and Running Commands
As Cursor creates new files and generates code, you need to click "Accept" to confirm the changes made by the AI agent.
After writing all the code, Cursor may prompt you to run some commands in the terminal. Executing these commands will install the required dependencies and launch the web application.
Just click "Run", which allows Cursor to execute these commands for us:

Running commands in Cursor
Once Cursor has built the application, it will tell you to copy and paste a link into your browser:

Cursor app link
Doing so will take you to the sentiment analysis web application, which looks like this:

Sentiment analysis app built with Cursor
This is a fully functional web application that employers can interact with. You can paste any sentence into the app, and it will predict the sentiment and return the result to you.
If you are early in the field and want to showcase your projects, I find tools like Cursor to be incredibly powerful.
Most data professionals don't know front-end programming languages like HTML and CSS, which means we are unable to present our projects as interactive applications.
Our code often sits in Kaggle notebooks, which gives us no competitive advantage over the hundreds of other applicants doing exactly the same thing.
Tools like Cursor, however, can set you apart from the competition. They can help you turn your ideas into reality by building whatever you describe.
AI Tool 2: Pandas AI
Pandas AI allows you to manipulate and analyze pandas DataFrames without writing any code.
You just have to type prompts in plain English, which removes the complexity that comes with preprocessing data and performing EDA.
If you don't already know it, pandas is a Python library that you can use for data analysis and manipulation.
You read data into something known as a pandas DataFrame, which then lets you perform operations on your data.
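For example, here is a tiny self-contained snippet (the column names are made up purely for illustration):
import pandas as pd

# A small DataFrame built by hand; normally you would load one from a CSV file
df = pd.DataFrame({"name": ["Ada", "Bo", "Cy"], "age": [36, 29, 41]})
print(df["age"].mean())  # operations like column averages take one line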
Let's go through an example of how you can preprocess, manipulate, and analyze data with Pandas AI.
For this demo, I will use the Titanic Survival Prediction dataset on Kaggle (download the train.csv file).
For this analysis, I suggest using a notebook environment such as Jupyter Notebook, Kaggle Notebooks, or Google Colab. The full code for this analysis can be found in this Kaggle notebook.
Step 1: Pandas AI Installation and Setup
Once you have set up your notebook environment, run the command below to install Pandas AI:
!pip install pandasai
Next, load the Titanic DataFrame with the following lines of code:
import pandas as pd
train_data = pd.read_csv('/kaggle/input/titanic/train.csv')
Now we import the following libraries:
import os
from pandasai import SmartDataframe
from pandasai.llm.openai import OpenAI
Next, we have to create a Pandas AI object to analyze the Titanic training dataset.
Here is what that means: Pandas AI is a library that connects your pandas DataFrame to a large language model. You can use Pandas AI to connect to GPT-4o, Claude-3.5, and other LLMs.
By default, Pandas AI uses a language model called Bamboo LLM. To connect Pandas AI to this language model, you can visit the Pandas AI website to get an API key.
Then, enter the API key into this block of code to create the Pandas AI object:
# Set the PandasAI API key
# By default, unless you choose a different LLM, it will use BambooLLM.
# You can get your free API key by signing up at
os.environ['PANDASAI_API_KEY'] = 'your-pandasai-api-key'  # Replace with your actual key

# Create a SmartDataframe with the default LLM (Bamboo)
smart_df = SmartDataframe(train_data)
Personally, I had some problems retrieving the Bamboo LLM API key. Because of this, I decided to get an API key from OpenAI instead, and I used the GPT-4o model for this analysis.
One caveat of this approach is that OpenAI API keys are not free. To use these models, you will have to purchase OpenAI API credits.
To do this, navigate to OpenAI's website and buy credits on the billing page. Then you can go to the "API Keys" page and create your API key.
Now that you have an OpenAI API key, run this block of code to connect the GPT-4o model to Pandas AI:
# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

# Initialize the OpenAI LLM
llm = OpenAI(api_token=os.environ["OPENAI_API_KEY"], model="gpt-4o")
config = {
    "llm": llm,
    "enable_cache": False,
    "verbose": False,
    "save_logs": True
}
# Create SmartDataframe with explicit configuration
smart_df = SmartDataframe(train_data, config=config)
We can now use this Pandas AI object to analyze the Titanic dataset.
Step 2: EDA and Data Preprocessing with Pandas AI
First, let's start with a simple prompt asking Pandas AI to describe the dataset:
smart_df.chat("Can you describe this dataset and provide a summary, format the output as a table.")
You will see a result that looks like a detailed data summary.

Titanic dataset description
Normally, we would have to write some code to get a summary like this. With Pandas AI, however, we only need to write a prompt.
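For comparison, here is roughly what the manual pandas version of that one prompt looks like (a sketch, using the train_data DataFrame we loaded earlier):
train_data.describe(include="all")  # summary statistics for every column
train_data.info()                   # column types and non-null counts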
If you are a beginner who wants to analyze some data but doesn't know how to write the code, this will save you a ton of time.
Next, let's do some exploratory data analysis with Pandas AI.
Here, I am asking for the correlation between the "Survived" variable and some of the other variables in the Titanic dataset:
smart_df.chat("Are there correlations between Survived and the following variables: Age, Sex, Ticket Fare. Format this output as a table.")
The above prompt should give you the correlations between "Survived" and the other variables in the dataset.
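Behind the scenes, Pandas AI writes and runs pandas code for you. A manual sketch of the same correlation check might look like this (encoding Sex numerically first, since correlation requires numeric columns):
# Sketch of the equivalent manual pandas code
df = train_data.copy()
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})  # encode for correlation
print(df[["Survived", "Age", "Sex", "Fare"]].corr()["Survived"])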
Next, let's ask Pandas AI to help us visualize the relationships between these variables:
1. Survived and Age
smart_df.chat("Can you visualize the relationship between the Survived and Age columns?")
The above prompt should give you a histogram that looks like this:

Titanic dataset age distribution
This visualization tells us that younger passengers were more likely to survive the accident.
2. Survived and Sex
smart_df.chat("Can you visualize the relationship between the Survived and Sex")
You should get a bar chart showing the relationship between "Survived" and "Sex".
3. Survived and Fare
smart_df.chat("Can you visualize the relationship between the Survived and Fare")
The above prompt generated a box plot, telling me that passengers who paid higher fares were more likely to survive the Titanic accident.
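If you ever want to reproduce one of these charts by hand, here is a rough matplotlib sketch of the age histogram from the first prompt (assuming the standard column names in the Kaggle train.csv):
import matplotlib.pyplot as plt

# Overlapping age histograms for survivors and non-survivors (sketch)
for survived, group_label in [(0, "Did not survive"), (1, "Survived")]:
    train_data.loc[train_data["Survived"] == survived, "Age"].plot(
        kind="hist", alpha=0.5, label=group_label)
plt.xlabel("Age")
plt.legend()
plt.show()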
Note that LLMs are non-deterministic, which means the output you get may differ from mine. However, you will still get a response that helps you better understand the data.
Next, we can perform some data preprocessing with prompts like these:
Prompt example 1
smart_df.chat("Analyze the quality of this dataset. Identify missing values, outliers, and potential data issues that would need to be addressed before we build a model to predict survival.")
Prompt example 2
smart_df.chat("Let's drop the cabin column from the dataframe as it has too many missing values.")
Prompt example 3
smart_df.chat("Let's impute the Age column with the median value.")
If you want to go through all the preprocessing steps used to clean this dataset with Pandas AI, you can find the full prompts and code in my Kaggle notebook.
In less than 5 minutes, I was able to clean the data: handling missing values, encoding categorical variables, and creating new features. All of this was done without writing much code, which is especially helpful if you are new to programming.
How to learn AI for data analytics: Next steps
In my opinion, the main selling point of tools like Cursor and Pandas AI is that they let you analyze data and edit code within your programming interface.
This is much better than copying and pasting code from your programming IDE into a chat interface like ChatGPT.
Additionally, as your code base grows (say, thousands of lines of code and more than 10 datasets), it is incredibly useful to have an integrated AI tool that has all the context and can understand the relationships between your code files.
If you want to learn AI for data analytics, here are some other tools I have found helpful:
- GitHub Copilot: This tool is similar to Cursor. It gives you code suggestions in your programming IDE and also has a chat interface you can interact with.
- Microsoft Copilot in Excel: This AI tool helps you automatically analyze the data in your spreadsheets.
- Python in Excel: This is an extension that lets you run Python code inside Excel. Although it is not an AI tool, I find it incredibly useful because it lets you centralize your data analysis without switching between different applications.
Natassha Selvaraj is a self-taught data scientist with a passion for writing. Natassha writes on everything data science-related, a true master of all data topics. You can connect with her on LinkedIn or check out her YouTube channel.