Using Google Lighthouse for Web Pages

Author: Osheen MacOscar

Published: November 30, 2023

tags: r, shiny, lighthouse

This is part one of a three-part series on Lighthouse for Shiny Apps.

  • Part 1: Using Google Lighthouse for Web Pages (This post)
  • Part 2: Analysing Shiny App start-up Times with Google Lighthouse
  • Part 3: Effect of Shiny Widgets with Google Lighthouse

Intro

This blog post was partly inspired by Colin Fay’s talk “Destroy All Widgets” at our “Shiny In Production” conference in 2022. In that talk, Colin spoke about HTML widgets and highlighted how detrimental they can be to the speed of a Shiny app. Speaking of which, the next Shiny In Production conference is taking place on 9th and 10th of October 2024, and recordings for this year’s events are coming soon to our YouTube channel.

Join us for the next installment of our Shiny in Production conference! For more details, check out our conference website!

I wanted to see if I could measure the speed of a collection of Shiny apps. To do so, I was directed to Google Lighthouse, and this blog post is dedicated to using and understanding Lighthouse before I start applying it to Shiny apps.

Google Lighthouse

Google Lighthouse is an open-source tool which can be used to test webpages (or web-hosted apps like Shiny apps). For a specified webpage, Lighthouse generates a report summarising several aspects of that page. For Shiny, the most important aspects are summarised in the overall Performance score and the Accessibility score, and one of its best features is the feedback the report gives on how you can improve.

Before you can use Lighthouse you must install it (and npm if you don’t already have it):

npm install -g lighthouse

Then to run a Google Lighthouse assessment in the command line you simply run:

lighthouse --output json --output-path data/output_file.json url

Where you specify:

  • The output format: either json or csv is available. I used json, as it stores more information.
  • The output path for where you would like the data to be stored.
  • The url of the Shiny app you would like to test (the location of your deployed app or, if developing locally, the URL that Shiny prints out when the app starts: Listening on http://127.0.0.1:4780).

One cool feature of Lighthouse is that you can test apps in both desktop and mobile settings. The default is mobile but you can specify desktop by adding --preset desktop after the url argument.
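The command-line options above can also be assembled from a script. Below is a minimal Python sketch that builds the same invocation (the URL and output path are just placeholders); actually running the command requires the lighthouse CLI to be on your PATH:

```python
# Build the Lighthouse CLI invocation described above.
def lighthouse_cmd(url, output_path, desktop=False):
    cmd = ["lighthouse", "--output", "json", "--output-path", output_path, url]
    if desktop:
        # --preset desktop goes after the url argument
        cmd += ["--preset", "desktop"]
    return cmd

cmd = lighthouse_cmd("https://www.jumpingrivers.com", "data/report.json", desktop=True)
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```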

When you run the command a new Chrome browser will open with the specified URL, where Lighthouse will run the report. This browser will automatically be closed by Lighthouse when it is finished. For all the Lighthouse demos in this blog I am going to use our website for consistency.

Another way to access Lighthouse is to simply use it in a Chrome browser and open the DevTools panel, as described in the Chrome Developer documentation. A Lighthouse tab should be visible in the “more tabs” section, where you can run performance checks interactively.

[Image: Lighthouse in Chrome DevTools. The panel is titled 'Generate a Lighthouse report', with an 'Analyze page load' button and radio buttons to select different options, including mobile or desktop devices (mobile is selected by default).]

From DevTools all you do is tick the boxes to specify the device type and performance metrics you want to assess. Then press “Analyze page load” to start the Lighthouse report generation.

Lighthouse Output

Depending on how you've run the Lighthouse report, the way you access the results differs. If you used the terminal and saved the Lighthouse output, you will have a csv or json file containing the data displayed in the report (the json output contains more in-depth data).
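As a sketch of what the saved json contains, the four category scores sit under the report's "categories" key as fractions between 0 and 1. The excerpt below is hand-written to mimic that structure (a real report holds far more detail, and in practice you would json.load the saved file instead):

```python
import json  # used for json.load in real use, see comment below

# Hand-written excerpt mimicking the "categories" section of a
# Lighthouse JSON report; in practice:
#   report = json.load(open("data/output_file.json"))
report = {
    "categories": {
        "performance": {"title": "Performance", "score": 1.0},
        "accessibility": {"title": "Accessibility", "score": 0.96},
        "best-practices": {"title": "Best Practices", "score": 1.0},
        "seo": {"title": "SEO", "score": 1.0},
    }
}

# Scores are stored as fractions (0-1); multiply by 100 to match
# the numbers shown on the HTML report.
for category in report["categories"].values():
    print(f"{category['title']}: {round(category['score'] * 100)}")
```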

Alternatively, from the terminal you can add --view after the URL, and the Lighthouse report will open in your browser when it is ready. Here is an example:

[Image: Lighthouse report in the browser, showing four scores out of 100 at the top (Performance: 100, Accessibility: 100, Best Practices: 92, SEO: 100; no PWA score), followed by a list of detailed metrics and a view of the page in question.]

Lastly, if you have run Lighthouse through DevTools in a Chrome browser, the report will appear in the DevTools panel. Location aside, it should look identical to the browser version created with the --view option:

[Image: the same report shown in the DevTools window, split with the page in question. This time the scores are Performance: 100, Accessibility: 96, Best Practices: 100, SEO: 100; no PWA score.]

You may have noticed that I got different scores in the two screenshots even though I used the same URL for both. This is a good opportunity to bring up one of the drawbacks of Lighthouse: the variability of its results. For example, you could run a test on our website and get a different score again. There are a number of reasons for this, including internet or device performance and browser extensions, so the Lighthouse developers recommend running multiple tests. This topic is covered in more detail here.
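One simple way to work around this variability (an approach of my own, not something Lighthouse does for you) is to collect the performance score from several runs and summarise them, for example with the median, which is less affected by a single unusually slow run than the mean. The scores below are invented:

```python
from statistics import median

# Hypothetical performance scores from five repeated runs on the same URL.
runs = [100, 96, 98, 100, 97]

# Report the median alongside the observed range.
print(f"median: {median(runs)}, range: {min(runs)}-{max(runs)}")
```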

Lighthouse Performance Metrics

Lighthouse scores apps on 5 measures: Performance, Accessibility, Best Practices, SEO (search engine optimization) and PWA (progressive web app).

Here, we will look at the overall Performance score. This is based on a weighted combination of several different metrics. As of Lighthouse 10 (the weights were slightly different in Lighthouse 8), the score is made up of:

  • 10% First Contentful Paint - The time from when the page starts loading until any part of the page’s content is rendered on the screen. “Content” can be text, images, <svg> elements or non-white <canvas> elements.
  • 10% Speed Index - How quickly the contents of a page are visibly populated.
  • 25% Largest Contentful Paint - The time between the page starting to load and the largest visible image or text block finishing rendering.
  • 30% Total Blocking Time - The total time between First Contentful Paint and another metric, Time to Interactive (how long the app takes to become usable), during which the page was blocked from responding to user input.
  • 25% Cumulative Layout Shift - A measure of the largest burst of layout shifts that occurs during the lifespan of a page; a good explanation can be found here.

Performance scores lie in a range between 0 (worst) and 100 (best).
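Given metric scores that Lighthouse has already normalised onto a 0-100 scale (it maps each raw measurement onto a score using curves derived from real-world data), the overall performance score is the weighted average using the Lighthouse 10 weights above. A sketch with invented metric scores:

```python
# Lighthouse 10 weights, as listed above.
weights = {
    "first-contentful-paint": 0.10,
    "speed-index": 0.10,
    "largest-contentful-paint": 0.25,
    "total-blocking-time": 0.30,
    "cumulative-layout-shift": 0.25,
}

def performance_score(metric_scores):
    """Weighted average of the five normalised (0-100) metric scores."""
    return round(sum(weights[m] * s for m, s in metric_scores.items()))

# Invented metric scores for illustration.
example = {
    "first-contentful-paint": 90,
    "speed-index": 80,
    "largest-contentful-paint": 70,
    "total-blocking-time": 100,
    "cumulative-layout-shift": 95,
}
print(performance_score(example))  # a middling LCP drags the overall score to 88
```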

Lighthouse Performance Suggestions

Another cool feature of Google Lighthouse is the performance improvement suggestions. I am going to use the Surfline website as an example for this section. These suggestions can be found underneath the performance score on the report and should look similar to the image below.

[Image: a Lighthouse report as in the images above, this time for a different webpage, with scores of 74, 81, 83 and 100. Below is a section called Opportunities, which lists ways the score can be improved, with estimated loading-time savings.]

Each suggestion can be expanded for more information and shows the estimated time saving from implementing it. These suggestions are helpful if you want to improve a particular aspect of your website or just streamline it generally.

This was an overview of Google Lighthouse, covering the various ways to run reports on web pages and some guidelines for interpreting them. We can also use Lighthouse to analyse Shiny applications, which will be covered in the next installment of this blog series.

