
Chapter 4: IO Operations

File system operations are the foundation of server-side development.

1. fs/promises Module

Node.js's fs/promises module provides Promise-based file system APIs and is the preferred choice for modern TypeScript projects.

import { readFile, writeFile, mkdir, readdir, stat, unlink } from "fs/promises";

// Read a text file
const content = await readFile("./config.json", "utf-8");
console.log(content);

// Write a file (creates it if missing, overwrites it if present)
await writeFile("./output.txt", "Hello, TypeScript!", "utf-8");

// Create a directory (recursive works like mkdir -p)
await mkdir("./logs/2024", { recursive: true });

// Read directory contents
const files = await readdir("./src");
console.log(files); // ["index.ts", "utils.ts", ...]

// Get file metadata
const info = await stat("./package.json");
console.log(info.size, info.isFile(), info.isDirectory());

// Delete a file
await unlink("./temp.txt");

🔄 Comparison with Python: readFile is similar to open().read(), readdir is similar to os.listdir(), and stat is similar to os.stat().
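Every fs/promises call rejects with an Error carrying a code property (such as "ENOENT" for a missing file), so it pairs naturally with try/catch. A minimal sketch; the helper name readIfExists and the file path are illustrative, not part of any API:

```typescript
import { readFile } from "fs/promises";

// Return the file's contents, or null if it does not exist.
// Unexpected errors (e.g. permission denied) are rethrown.
async function readIfExists(filePath: string): Promise<string | null> {
    try {
        return await readFile(filePath, "utf-8");
    } catch (err) {
        if ((err as NodeJS.ErrnoException).code === "ENOENT") return null;
        throw err;
    }
}

const missing = await readIfExists("./no-such-file.txt");
console.log(missing); // null
```

Checking the error code is generally preferable to calling stat() first, since the file could disappear between the check and the read.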

2. Sync vs Async

Node.js's fs module provides both sync and async versions. In server scenarios, always prefer async APIs.

import { readFileSync } from "fs";
import { readFile } from "fs/promises";

// Sync read: blocks the event loop
const data = readFileSync("./config.json", "utf-8");

// Async read: does not block
const dataAsync = await readFile("./config.json", "utf-8");

When to use the sync API?

  • CLI scripts or one-off tools
  • Loading config files during application startup
  • Helper operations in test code

⚠️ Never use sync APIs in HTTP server request handlers — it blocks all other requests.
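A further benefit of the async API: independent operations can overlap instead of running one after another. A minimal sketch using Promise.all with throwaway files in the OS temp directory (the file names are arbitrary):

```typescript
import { writeFile, readFile, rm } from "fs/promises";
import { tmpdir } from "os";
import path from "path";

const a = path.join(tmpdir(), "demo-a.txt");
const b = path.join(tmpdir(), "demo-b.txt");
await writeFile(a, "alpha", "utf-8");
await writeFile(b, "beta", "utf-8");

// Both reads are in flight at the same time; Promise.all
// resolves once both have completed.
const [contentA, contentB] = await Promise.all([
    readFile(a, "utf-8"),
    readFile(b, "utf-8"),
]);
console.log(contentA, contentB); // alpha beta

// Clean up the temporary files
await Promise.all([rm(a), rm(b)]);
```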

3. Stream Read/Write

When processing large files, Streams avoid loading the entire file into memory. This is consistent with Java's InputStream/OutputStream philosophy.

import { createReadStream, createWriteStream } from "fs";
import { pipeline } from "stream/promises";

// Copy a large file via streams
async function copyFile(src: string, dest: string): Promise<void> {
    const readStream = createReadStream(src);
    const writeStream = createWriteStream(dest);
    await pipeline(readStream, writeStream);
    console.log("Copy complete");
}

// Process the file chunk by chunk
const stream = createReadStream("./huge-log.txt", { encoding: "utf-8" });
let lineCount = 0;
for await (const chunk of stream) {
    // Count newline characters; calling split("\n") per chunk would
    // overcount by one at every chunk boundary.
    lineCount += ((chunk as string).match(/\n/g) ?? []).length;
}
console.log(`Total lines: ${lineCount}`);

When to use Streams?

  • Files larger than 100MB
  • Need to process data line-by-line or chunk-by-chunk
  • Network transfer (sending files in HTTP responses)
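When the processing is genuinely line-oriented, Node's built-in readline module can wrap a read stream and yield complete lines, so a chunk boundary never splits a line in half. A minimal sketch that counts the lines of a temporary file (the file path is illustrative):

```typescript
import { createReadStream } from "fs";
import { writeFile, rm } from "fs/promises";
import { createInterface } from "readline";
import { tmpdir } from "os";
import path from "path";

const file = path.join(tmpdir(), "demo-lines.txt");
await writeFile(file, "one\ntwo\nthree\n", "utf-8");

// readline buffers partial lines internally and emits whole lines
const rl = createInterface({
    input: createReadStream(file, { encoding: "utf-8" }),
    crlfDelay: Infinity, // treat \r\n as a single line break
});

let lines = 0;
for await (const line of rl) {
    lines += 1;
}
console.log(lines); // 3

await rm(file);
```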

4. path Module

Never manually concatenate path strings. The path module correctly handles path separators across different operating systems.

import path from "path";
import { fileURLToPath } from "url";

// Common helpers
path.join("/usr", "local", "bin");      // "/usr/local/bin"
path.resolve("src", "index.ts");         // returns an absolute path
path.dirname("/app/src/index.ts");       // "/app/src"
path.basename("/app/src/index.ts");      // "index.ts"
path.extname("report.pdf");             // ".pdf"
path.parse("/app/src/index.ts");
// { root: "/", dir: "/app/src", base: "index.ts", ext: ".ts", name: "index" }

// Recreate __dirname in an ESM module (ESM has no __dirname global)
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

// Practical use: read a config file in the same directory as this module
const configPath = path.join(__dirname, "config.json");
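A few more helpers worth knowing are path.relative, path.normalize, and path.isAbsolute. A short sketch; the values shown in the comments assume POSIX-style paths:

```typescript
import path from "path";

// Path from one location to another
console.log(path.relative("/app/src", "/app/dist/index.js")); // "../dist/index.js"

// Collapse redundant separators and ".." segments
console.log(path.normalize("/app//src/../config.json"));      // "/app/config.json"

// Distinguish absolute from relative paths
console.log(path.isAbsolute("./src"));                        // false
```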

5. JSON File Operations

JSON is the most commonly used data exchange format. TypeScript can leverage its type system for type safety.

import { readFile, writeFile } from "fs/promises";

// Define the expected shape
interface AppConfig {
    port: number;
    host: string;
    debug: boolean;
}

// Read and parse JSON
async function loadConfig(filePath: string): Promise<AppConfig> {
    const raw = await readFile(filePath, "utf-8");
    return JSON.parse(raw) as AppConfig;
}

// Serialize and write JSON
async function saveConfig(filePath: string, config: AppConfig): Promise<void> {
    const json = JSON.stringify(config, null, 2);
    await writeFile(filePath, json, "utf-8");
}
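Note that the `as AppConfig` cast above only asserts the type at compile time; nothing checks the data at runtime. Before reaching for a library, a hand-rolled type guard can validate small configs with zero dependencies. A minimal sketch mirroring the AppConfig interface above:

```typescript
interface AppConfig {
    port: number;
    host: string;
    debug: boolean;
}

// A type guard: returns true only if `value` really has the
// AppConfig shape, and narrows the type for the caller.
function isAppConfig(value: unknown): value is AppConfig {
    if (typeof value !== "object" || value === null) return false;
    const v = value as Record<string, unknown>;
    return (
        typeof v.port === "number" &&
        typeof v.host === "string" &&
        typeof v.debug === "boolean"
    );
}

const parsed: unknown = JSON.parse('{"port":3000,"host":"localhost","debug":false}');
console.log(isAppConfig(parsed));          // true
console.log(isAppConfig({ port: "3000" })); // false
```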

Use zod for runtime validation to ensure external data truly matches expected types:

import { z } from "zod";

const ConfigSchema = z.object({
    port: z.number().int().min(1).max(65535),
    host: z.string(),
    debug: z.boolean(),
});

type Config = z.infer<typeof ConfigSchema>;

const raw = JSON.parse(await readFile("config.json", "utf-8"));
const config = ConfigSchema.parse(raw); // throws ZodError on validation failure

6. CSV Processing

For CSV file processing, csv-parse and csv-stringify libraries are recommended.

npm install csv-parse csv-stringify

import { parse } from "csv-parse/sync";
import { readFileSync } from "fs";

interface SalesRecord {
    date: string;
    product: string;
    amount: number;
}

const csvContent = readFileSync("./sales.csv", "utf-8");
const records = parse(csvContent, {
    columns: true,       // use the first row as column names
    skip_empty_lines: true,
    cast: (value, context) => {
        if (context.column === "amount") return Number(value);
        return value;
    },
}) as SalesRecord[];

records.forEach((r) => console.log(`${r.date}: ${r.product} - ¥${r.amount}`));

📝 Chapter Summary

  • Prefer fs/promises: the Promise-based async API is the modern standard
  • Use Streams for large files: avoids memory overflow; pipe with pipeline()
  • Use the path module for paths: cross-platform compatible; use import.meta.url for __dirname in ESM
  • Guard JSON parsing with runtime checks: type guards or zod make external data safer