LLMs in R and Python

R
Python
AI
Tutorial
Author

Nithin M

Published

May 31, 2025

Audience

  • R/Python users
  • some experience with coding
  • have used ChatGPT/Claude in the browser
  • have not used LLMs from code

Getting started

We will access LLMs through their HTTP APIs, using the following packages:

R

Setting Up

We will use the ellmer and dotenv packages in R. The dotenv package loads our API keys from a .env file into environment variables.

install.packages(c("ellmer", "dotenv"))

First, we need to obtain API keys. Since the OpenAI and Claude APIs are not free, we will use Gemini and Groq in our examples.

  • Obtain the API keys and save them in a .env file in the working directory as key-value pairs:
GEMINI_API_KEY=your_key_here
GROQ_API_KEY=your_key_here
library(ellmer)
library(dotenv)  # attaching dotenv loads the .env file from the working directory

options(ellmer.model = ellmer::chat_google_gemini())
# options(ellmer.model = ellmer::chat_groq())  # alternative backend
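If the keys were loaded correctly, the corresponding environment variables will be non-empty. A quick sanity check (a small addition here, assuming the key names from the .env file above):

# Both should return TRUE once the .env file has been read
nzchar(Sys.getenv("GEMINI_API_KEY"))
nzchar(Sys.getenv("GROQ_API_KEY"))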

Now let us set the model and create a chat object.

# chat_google_gemini(system_prompt = "You are an experienced R programmer. Give me only tidyverse code. Also use proper linting and styling when giving code")
client <- chat_google_gemini()
Using model = "gemini-2.0-flash".
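ellmer picks a default model for us, as the message above shows. The chat constructors also accept a model argument if you want to pin a specific version (shown here with the default reported above):

client <- chat_google_gemini(model = "gemini-2.0-flash")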

Now that we have created the chat object, let us send it a prompt.

client$chat("Summarize the plot of Romeo and Juliet in 20 words or less.")
Forbidden love between feuding families leads to tragic deaths of Romeo and 
Juliet, ending the bitter conflict.

We can continue the conversation by calling chat() again.

client$chat("Now Macbeth")
Driven by ambition and prophecy, Macbeth murders his king, becomes a tyrant, 
and meets a bloody end.

Now we can modify the system prompt to get more tailored results.

client <- chat_google_gemini(
  system_prompt = "You are well versed in literature. Give some lucid explanation of the literature in about 100 words"
)
client
<Chat Google/Gemini/gemini-2.0-flash turns=1 tokens=0/0 $0.00>
── system [0] ──────────────────────────────────────────────────────────────────
You are well versed in literature. Give some lucid explanation of the literature in about 100 words
client$chat("Summarize the plot of Romeo and Juliet")
Romeo and Juliet is a tragic love story about two young people from feuding 
families, the Montagues and the Capulets, who fall deeply in love.

*   **The Feud:** The play opens with a violent street brawl between the two 
families, highlighting their long-standing animosity.
*   **Love at First Sight:** Romeo, a Montague, attends a Capulet ball and 
instantly falls in love with Juliet, a Capulet. They are unaware of each 
other's family affiliation.
*   **Secret Marriage:** Despite their families' hatred, Romeo and Juliet 
declare their love and secretly marry with the help of Friar Laurence.
*   **Tragedy Strikes:** A fight breaks out between Romeo's friend Mercutio and
Juliet's cousin Tybalt. Romeo tries to stop them, but Tybalt kills Mercutio. 
Enraged, Romeo kills Tybalt and is banished from Verona by the Prince.
*   **Desperate Measures:** Juliet's parents, unaware of her marriage to Romeo,
arrange for her to marry Paris. Desperate to avoid this, Juliet seeks help from
Friar Laurence. He devises a plan for her to take a potion that will make her 
appear dead, so she can avoid the marriage and reunite with Romeo.
*   **Miscommunication:** The message about Juliet's staged death doesn't reach
Romeo in time. Believing Juliet is truly dead, Romeo returns to Verona and goes
to Juliet's tomb.
*   **The Double Suicide:** In the tomb, Romeo kills Paris, who is also there 
to mourn Juliet. Overcome with grief, Romeo drinks poison and dies. When Juliet
awakens and finds Romeo dead, she takes his dagger and kills herself.
*   **Reconciliation:** The families arrive to find their children dead. Friar 
Laurence explains the events that led to the tragedy. Overwhelmed with remorse,
the Montagues and Capulets finally reconcile, ending their feud.

In short, Romeo and Juliet is a story of passionate love thwarted by familial 
hatred, leading to the tragic deaths of both lovers. Their deaths ultimately 
bring peace to their feuding families.

We can also use the interactive browser window for the conversation:

live_browser(client)

Further, we can also save the results of our chat.

client <- chat_groq(
  system_prompt = "You are a friendly but terse assistant. You always use markdown syntax for code."
)
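The reply shown below came from a prompt about purrr's map() versus lapply(); the exact prompt used in the original chat is not shown, so the one here is only illustrative. Because chat() returns the assistant's reply as text, it can be assigned and written to a file:

# Illustrative prompt -- the original wording is not shown in the post
reply <- client$chat("Explain purrr's map() function and compare it with lapply()")

# Save the markdown reply for later reference
writeLines(reply, "map_vs_lapply.md")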
Reply from the chat using ellmer:

Using the map Function in R

The map function is part of the purrr package in R, which provides a composable functional programming pipeline. It applies a function to each element of an input vector or list, returning a list with the same length as the input.

Here’s an example:

library(purrr)

# Create a list of numbers
numbers <- list(a = 1, b = 2, c = 3, d = 4)

# Double each number using map
result <- map(numbers, ~ .x * 2)

# Print the result
result

Output:

$a 
[1] 2

$b 
[1] 4

$c 
[1] 6

$d 
[1] 8

Comparing map with lapply

lapply is a built-in R function that applies a function to each element of a list and returns a list. Here’s an example:

# Double each number using lapply
result <- lapply(numbers, function(x) x * 2)

# Print the result
result

Output:

$a 
[1] 2

$b 
[1] 4

$c 
[1] 6

$d 
[1] 8

As you can see, both map and lapply produce the same result in this example. However, map is generally more concise and expressive, especially when dealing with complex data structures or pipelines.

Key differences:

  • map is part of the purrr package, while lapply is a built-in R function.
  • map is more concise and expressive, making it a popular choice for functional programming in R.
  • lapply is more flexible, as it can handle vectors and other types of input, whereas map is limited to lists and vectors.

Choose map when:

  • You need to perform a simple transformation on each element of a list or vector.
  • You want to leverage the concise syntax and expressiveness of functional programming.

Choose lapply when:

  • You need to perform a more complex operation that involves manipulating the index, such as using lapply with a named list.
  • You prefer a more traditional, R-style syntax for applying a function to a list.

I hope this helps!

Python

In Python, we use the chatlas package. Before using it, let us install it along with the python-dotenv package.

pip install python-dotenv
pip install -U chatlas

Similar to ellmer in R, we create a chat object first.

from chatlas import ChatGroq
from dotenv import load_dotenv

load_dotenv()  # reads GROQ_API_KEY from the .env file

client = ChatGroq()
client.chat("Summarize the plot of Romeo and Juliet in 20 words or less.")
Two young lovers from feuding families meet, secretly marry, and tragically
die in a misunderstanding.

 

Export chat

Easily get a full markdown or HTML export of a conversation:

client.export("convo.md", title="Python Q&A")

✨ Wrapping Up

Whether you’re an R aficionado or a Pythonista, incorporating LLMs into your workflow is no longer a stretch. With just a few lines of setup, you’re ready to bring powerful generative models into your daily analysis, documentation, or even code assistance.

The possibilities are wide open. So — what will you ask your LLM next?