The art of JavaScript Generators

Introduction

JavaScript generators are a powerful feature that lets you create functions that can be paused and later resumed. Besides handling asynchronous operations, they provide an elegant way to create iterables. In this post, we will learn about generators and use them in real-life examples suited to frontend (React) as well as backend (Node.js) development.

What are Generators

JavaScript generators are a special type of function that can be paused and resumed, making them useful for handling complex tasks. Unlike normal functions, which run until completion, generators allow you to control when and how much of the function executes at a time. They are created using the function* syntax.

The key feature of generators is the yield keyword, which is used to return a value and pause the function’s execution. When you call the generator again, it resumes from where it left off, continuing until the next yield statement or the end of the function.

Generators also work as iterators, meaning they can be used in loops or with .next() calls to retrieve values step by step. Each time you invoke .next(), it provides the next value and pauses again.

A common use case for generators is managing large datasets or infinite sequences, as they allow you to process data incrementally, which saves memory. Another popular use is in asynchronous programming, where generators can make code flow more readable by pausing at certain steps.

Here's a quick example of a generator:

function* numberGenerator() {
  yield 1;
  yield 2;
  yield 3;
}

const gen = numberGenerator();
console.log(gen.next().value); // 1
console.log(gen.next().value); // 2
console.log(gen.next().value); // 3
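Because generators produce values lazily, they can also describe sequences that never end, which is why they work well for large datasets and infinite sequences. Here is a minimal sketch of an infinite counter consumed with a for...of loop (the naturalNumbers name is just illustrative):

// Values are produced on demand, so nothing runs until the consumer asks.
function* naturalNumbers() {
  let n = 1;
  while (true) {
    yield n++;
  }
}

// Generators are iterable, so for...of works; break simply stops pulling values.
for (const n of naturalNumbers()) {
  if (n > 5) break;
  console.log(n); // 1 2 3 4 5
}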

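Generators also pair naturally with promises: pausing at each yield until a promise settles is essentially the pattern that async/await formalizes. Below is a rough sketch of such a driver; the run and step helpers are illustrative, not a standard API, and error handling is omitted for brevity.

// Drives a generator that yields promises: each yielded promise is awaited,
// and its resolved value is fed back into the generator via next().
function run(generatorFn) {
  const gen = generatorFn();
  function step(previousValue) {
    const { value, done } = gen.next(previousValue);
    if (done) return Promise.resolve(value);
    return Promise.resolve(value).then(step);
  }
  return step();
}

// Hypothetical usage: the function "pauses" at each yield, much like await.
run(function* () {
  const response = yield fetch("https://api.example.com/data");
  const data = yield response.json();
  console.log(data);
});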

Frontend Use Case: Infinite Scrolling in React

One practical application of generators in frontend development is implementing infinite scrolling. Let's create a React component that uses a generator to fetch data in chunks as the user scrolls:
import React, { useState, useEffect } from "react";

function* dataFetcher() {
  let page = 1;
  while (true) {
    yield fetch(`https://api.example.com/data?page=${page}`).then((response) =>
      response.json()
    );
    page++;
  }
}

function InfiniteScrollList() {
  const [items, setItems] = useState([]);
  const [fetcher, setFetcher] = useState(null);

  useEffect(() => {
    setFetcher(dataFetcher());
  }, []);

  const loadMore = async () => {
    if (fetcher) {
      const { value } = await fetcher.next();
      const newItems = await value;
      setItems((prevItems) => [...prevItems, ...newItems]);
    }
  };

  return (
    <div>
      <ul>
        {items.map((item) => (
          <li key={item.id}>{item.name}</li>
        ))}
      </ul>
      <button onClick={loadMore}>Load More</button>
    </div>
  );
}

export default InfiniteScrollList;
In this example, the dataFetcher generator yields promises that fetch data from an API. The React component uses this generator to load more items when the user clicks a button. You could easily extend this to trigger on scroll events for true infinite scrolling.
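For instance, one way to get scroll-triggered loading is to observe a sentinel element rendered after the list and call loadMore whenever it comes into view. The snippet below is a rough sketch of code you might add inside InfiniteScrollList; the sentinelRef element and the effect's dependency list are assumptions, not part of the original component.

// Rough sketch: add inside InfiniteScrollList, and render
// <div ref={sentinelRef} /> just after the <ul> in the returned JSX.
const sentinelRef = React.useRef(null);

useEffect(() => {
  const sentinel = sentinelRef.current;
  if (!sentinel) return;

  // Fire loadMore whenever the sentinel scrolls into the viewport.
  const observer = new IntersectionObserver(([entry]) => {
    if (entry.isIntersecting) {
      loadMore();
    }
  });

  observer.observe(sentinel);
  return () => observer.disconnect();
}, [fetcher]);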

Backend Use Case: Streaming Large Datasets with Node.js

Generators are incredibly useful in backend development, especially when dealing with large datasets. Let's create a Node.js script that uses a generator to stream a large dataset from a database and process it in chunks.
const { MongoClient } = require("mongodb");

async function* largeDatasetStreamer(collection, batchSize = 1000) {
  let skip = 0;
  while (true) {
    const batch = await collection.find().skip(skip).limit(batchSize).toArray();
    if (batch.length === 0) break;
    yield batch;
    skip += batchSize;
  }
}

async function processLargeDataset() {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();

  const db = client.db("mydatabase");
  const collection = db.collection("largedata");

  const streamer = largeDatasetStreamer(collection);

  for await (const batch of streamer) {
    for (const item of batch) {
      // Process each item
      console.log(item);
    }
  }

  await client.close();
}

processLargeDataset().catch(console.error);
In this Node.js example, we create a generator function largeDatasetStreamer that yields batches of data from a MongoDB collection. The processLargeDataset function uses this generator to iterate over the entire dataset in manageable chunks, allowing for efficient processing of large amounts of data without loading everything into memory at once.
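Because async generators compose, the batched streamer can also be wrapped so that callers see one document at a time while batching stays an internal detail. Here is a small sketch building on largeDatasetStreamer above (itemStreamer is an illustrative name, not part of the original script):

// Flattens the batches from largeDatasetStreamer into individual documents.
async function* itemStreamer(collection, batchSize = 1000) {
  for await (const batch of largeDatasetStreamer(collection, batchSize)) {
    yield* batch; // delegate to the array's iterator, one document at a time
  }
}

// Usage inside processLargeDataset:
// for await (const item of itemStreamer(collection)) {
//   console.log(item);
// }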

Conclusion

JavaScript generators offer a unique and powerful way to handle asynchronous operations and manage complex flows in both frontend and backend development. Whether you're implementing infinite scrolling in a React application or processing large datasets in Node.js, generators can help you write more efficient and maintainable code. By leveraging generators, you can create more responsive user interfaces and build scalable backend systems that can handle large amounts of data with ease. As you've seen in the examples above, generators provide a clean and intuitive API for working with asynchronous and iterative processes, making them a valuable tool in any JavaScript developer's toolkit.