React Lazy Loading

Introduction

It's a new year and apparently I'm already late to the party - as per usual.

According to Google, the Suspense component was first introduced alongside React.lazy in version 16.6 of React way back in 2018, and Suspense support was expanded in the React 18 release in 2022. But hey, if this information was new to me, maybe it will be new to you too. Or if you're already familiar with React.lazy and Suspense, feel free to skip this post.

I've done lazy loading in other frameworks like Angular, but as embarrassing as it is to admit, I hadn't really done lazy loading in React until just last week. In my defense, my client-side React web apps are usually already quite fast. That being said, React makes it so easy to implement lazy loading that I figured I might as well take advantage of it.

Suspense seems to be commonly used in Next.js codebases, but in this post I'm just going to show how I implemented lazy loading in a frontend codebase that uses client-side routing (using React 19.1.X as of this writing) instead of SSR - SEO be damned.

Implementation

The following before-and-after frontend code snippets illustrate the simplicity of implementing lazy loading in React.

Before:

router.tsx (No Lazy Loading)
// No lazy loading.
// The client side AboutPage component code
// will be included in the main JavaScript bundle.

// Other imports omitted.
import { createBrowserRouter } from 'react-router-dom'
import AboutPage from './pages/AboutPage'

export const router = createBrowserRouter([
  {
    // Other attrs omitted.
    children: [
      // Other elements omitted.
      {
        path: 'about',
        element: <AboutPage />
      },
    ],
  },
])

After:

router.tsx (With Lazy Loading)
// With lazy loading.
// The client side AboutPage component code
// will only be loaded if the user navigates
// to the /about route. A loading message will
// be displayed while the page is being loaded.

// Other imports omitted.
import { Suspense, lazy } from 'react'
import { createBrowserRouter } from 'react-router-dom'

const AboutPage = lazy(() => import('./pages/AboutPage'))

export const router = createBrowserRouter([
  {
    // Other attrs omitted.
    children: [
      // Other elements omitted.
      {
        path: 'about',
        element: (
          <Suspense fallback={<div>Loading...</div>}>
            <AboutPage />
          </Suspense>
        ),
      },
    ],
  },
])

In summary: we import lazy and Suspense from react, we use lazy to dynamically import the component, and in the router config we wrap the route's element in Suspense with some fallback JSX that is displayed until the lazy-loaded component is ready to render. Easy peasy.
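One caveat: lazy expects the dynamic import to resolve to a module with a default export, so the snippet above assumes AboutPage is the default export of its file. If it were a named export instead, you could map it into the expected shape yourself - a minimal sketch (the AboutPage name here is just illustrative):

// lazy wants a module shaped like { default: Component },
// so map a named export into that shape.
const AboutPage = lazy(() =>
  import('./pages/AboutPage').then((module) => ({ default: module.AboutPage }))
)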

Conclusion

One of the main aspects of frontend performance optimization is minimizing the number of bytes sent over the network. Data, after all, is physical. We call it things like "digital" and "virtual", but it is very much physical, and it comes in all shapes and sizes.

Internet and network connections are also physical. Sometimes data travels through a high-speed fiber optic cable, sometimes through a copper wire, sometimes through a Wi-Fi connection, sometimes through an old 3G mobile connection - you get the idea.

It takes light - the fastest thing in the universe - roughly 13 milliseconds to travel from New York to Los Angeles (about 3,940 km at 299,792 km/s). Data does not move at the speed of light. It takes an estimated 40-70 milliseconds for a data packet to travel from New York to Los Angeles.

TTFB (or Time to First Byte) measures how long it takes for a web browser to receive the first byte of data from the server. You can see the benefit of locating your servers geographically close to your users (usually via a CDN) if your goal is a sub-50 millisecond TTFB.
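If you're curious what your own TTFB looks like, the browser's Navigation Timing API exposes it. Here's a minimal TypeScript sketch you could drop into any client-side code (the variable names are just illustrative):

// Grab the navigation entry for the current page load.
const [nav] = performance.getEntriesByType('navigation') as PerformanceNavigationTiming[]

if (nav) {
  // responseStart is when the first byte of the response arrived,
  // measured from the start of the navigation.
  const ttfb = nav.responseStart - nav.startTime
  console.log(`TTFB: ${Math.round(ttfb)} ms`)
}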

A sub-50 millisecond TTFB wouldn't be feasible if your users are located in Boston but your servers are located in San Diego (an even greater distance than New York to Los Angeles). And that 40-70 millisecond estimate only accounts for the data transfer; it doesn't account for whatever latency your server application code adds to the request.

Heck, you could have a slow database query that takes hundreds of milliseconds, but since we're talking about targeting a sub-50 millisecond TTFB, let's assume the server code is highly optimized.

The laws of physics are what they are. As web developers we have techniques like minification, lazy loading, streaming, and compression to help minimize the number of bytes that get sent over the network. These techniques, combined with locating our servers geographically close to our end users, are the best ways we have to cope with the performance limitations imposed by the universe.
