R as a tool for Systems Administration

Published: January 27, 2020

tags: r, packages, cloud

When talking about languages to use in production data science, R is usually not part of the conversation, and when it is, it is mentioned as a secondary language. One of the main reasons for this is that R is commonly seen as a language for statistical analysis, while languages like Python and JavaScript are considered better suited to other tasks, such as building web applications or deploying machine learning models. However, one area where R's capabilities remain under-explored is Systems Administration.

At Jumping Rivers we use R as our main tool for Systems Administration tasks. Our approach is to write one package per service and then develop the specific functions needed to manage it. One of the packages we have developed in this way is {jrDroplet}.


{jrDroplet}

{jrDroplet} is a package designed specifically to manage the Digital Ocean Virtual Machines used for our training courses. The idea is that a single line of code creates a Digital Ocean droplet with all of the packages needed for a course already installed, hiding the infrastructure complexities in the background. Below is an overview of our create_droplet() function, reduced slightly for simplicity:

create_droplet = function(client_name,
                          droplet_name,
                          vm_size,
                          ssh_keys,
                          image_base,
                          region,
                          sub_domain,
                          dns_root) {

  image = get_latest_training_snapshot(region = region,
                                       base = image_base)[[1]]

  message(paste0("Using image ", image$name))

  analogsea::droplet_create(name = droplet_name,
                            region = region,
                            ssh_keys = ssh_keys,
                            size = vm_size,
                            image = image$id)

  droplets = analogsea::droplets()

  message("Waiting for IP address to be assigned to VM")

  ip_address = droplets[[droplet_name]]$networks$v4[[1]]$ip_address

  dr = analogsea::domain_record_create(
    domain = dns_root,
    type = "A",
    name = sub_domain,
    data = ip_address
  )
}
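To give a sense of the "single line" usage described above, a call to create_droplet() might look like the following (every value here is hypothetical and purely for illustration):

create_droplet(client_name = "example-client",
               droplet_name = "r-course-jan-2020",
               vm_size = "s-2vcpu-4gb",
               ssh_keys = "training-key",
               image_base = "r",
               region = "lon1",
               sub_domain = "r-course",
               dns_root = "example.com")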

This function takes the given arguments and carries out a series of steps that would otherwise have to be performed manually in the Digital Ocean interface. Below I explain what is happening in the code and what the equivalent would be in the interface.

 image = get_latest_training_snapshot(region = region,
                                       base = image_base)[[1]]

In this code chunk we obtain the latest snapshot created in the Jumping Rivers Digital Ocean organisation, searching by base image, i.e. whether it is the R image or the Python image. These base images are built with a tool called Packer; the implementation details of that process will come in a future post. The equivalent in the interface would be to select the Snapshots tab when creating a droplet and manually pick the training snapshot.
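The helper itself is not shown in this post, but a minimal sketch of how such a function might be written on top of {analogsea} is below. The filtering logic and the snapshot fields (name, created_at) are assumptions for illustration, not the actual {jrDroplet} implementation:

# Minimal sketch only: assumes analogsea::snapshots() returns a list of
# snapshot objects with name and created_at fields, as the Digital Ocean
# API provides. Not the actual {jrDroplet} implementation.
get_latest_training_snapshot = function(region, base) {
  snaps = analogsea::snapshots()
  # Keep snapshots built from the requested base image, e.g. "r" or "python"
  snaps = Filter(function(s) grepl(base, s$name, ignore.case = TRUE), snaps)
  # Return them with the most recently created snapshot first
  # (filtering by region is omitted here for brevity)
  snaps[order(vapply(snaps, function(s) s$created_at, character(1)),
              decreasing = TRUE)]
}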

analogsea::droplet_create(
                            name = droplet_name,
                            region = region,
                            ssh_keys = ssh_keys,
                            size = vm_size,
                            image = image$id)

In this code chunk we use the {analogsea} package, which is the backbone of {jrDroplet}. {analogsea} manages Digital Ocean infrastructure through its API and, following open-source principles, we build on top of it for our specific use case. Here, the droplet_create() function creates the droplet for our training VM with the desired parameters, based on the latest training snapshot.
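For context, the region, size and SSH key arguments are not arbitrary strings: {analogsea} also provides helpers for listing what is available on a Digital Ocean account, so suitable values can be looked up first (the exact output depends on the account; the example values in the comments are hypothetical):

library(analogsea)

regions()   # available data-centre regions, e.g. "lon1"
sizes()     # available droplet size slugs, e.g. "s-2vcpu-4gb"
keys()      # SSH keys registered on the account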

 droplets = analogsea::droplets()

  ip_address = droplets[[droplet_name]]$networks$v4[[1]]$ip_address

  dr = analogsea::domain_record_create(
    domain = dns_root,
    type = 'A',
    name = sub_domain,
    data = ip_address
  )

This is the final code chunk we are going to discuss in this post. Here we first list all of the available droplets and then look up the IP address of the droplet we just created. We need this IP address for domain_record_create(). Very briefly, a domain record maps a specific name to a specific IP address, and these records are served by the Domain Name System (DNS). So in this command we take the IP address and use our DNS root name to create a new subdomain specifically for this new droplet. To do this through the Digital Ocean interface we would need to go to the Networking tab and pick the options from dropdown menus.
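As a concrete, entirely hypothetical illustration of what that final call produces:

# Hypothetical values for illustration only
ip_address = "203.0.113.10"
dns_root   = "training.example.com"
sub_domain = "shiny-course"

# Creates an A record so that shiny-course.training.example.com
# resolves to 203.0.113.10
analogsea::domain_record_create(domain = dns_root,
                                type = "A",
                                name = sub_domain,
                                data = ip_address)

# The new record can then be checked with
analogsea::domain_records(dns_root)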

This is just one example of how we use R as a tool for Systems Administration. Another tool we have created is {monitR}, a package for monitoring the full stack of services offered to a specific client. It provides the back-end functions to manage the monitoring data, building on existing systems administration tools and frameworks, along with a Shiny dashboard that lets us visualise all of the data for our clients. In conclusion, R has many uses beyond classical statistical analysis and shouldn't be treated as a language solely for Data Scientists.

