File I/O
System.IO vs Node.js fs module — reading, writing, and JSON serialization
Introduction
In this lesson, you'll learn about file I/O in C#. Coming from TypeScript, you already know this ground from Node.js: the fs module for reading and writing, path for path handling, and JSON.parse/JSON.stringify for serialization. C# covers the same territory with System.IO (File, Directory, Path, StreamReader) and System.Text.Json, which we'll explore step by step.
The C# Way
Let's see how C# handles this concept. Here's a typical example:
```csharp
using System.IO;
using System.Text.Json;

// Read
string text = File.ReadAllText("data.txt");
string[] lines = File.ReadAllLines("data.txt");

// Write
File.WriteAllText("out.txt", "Hello!\n");
File.AppendAllText("log.txt", $"entry {DateTime.Now}\n");

// JSON
string raw = File.ReadAllText("config.json");
var config = JsonSerializer.Deserialize<Config>(raw,
    new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
File.WriteAllText("out.json",
    JsonSerializer.Serialize(config, new JsonSerializerOptions { WriteIndented = true }));

// StreamReader (large files) — like a Node.js ReadStream
using var reader = new StreamReader("big.csv");
while (!reader.EndOfStream)
{
    string? line = reader.ReadLine();
    Process(line);
}

// Path
Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "data", "file.txt");
Path.GetExtension("archive.tar.gz"); // ".gz"

// Directory
Directory.CreateDirectory("newdir"); // recursive by default
string[] files = Directory.GetFiles(".", "*", SearchOption.TopDirectoryOnly);
```

Comparing to TypeScript
Here's how you might have written similar code in TypeScript:
```typescript
import fs from "fs/promises";
import path from "path";
import { createReadStream } from "fs";

// Read
const text = await fs.readFile("data.txt", "utf-8");
const lines = text.split("\n");

// Write
await fs.writeFile("out.txt", "Hello!\n", "utf-8");
await fs.appendFile("log.txt", `entry ${Date.now()}\n`);

// JSON
const raw = await fs.readFile("config.json", "utf-8");
const config = JSON.parse(raw) as Config;
await fs.writeFile("out.json", JSON.stringify(config, null, 2));

// Stream (large files)
const stream = createReadStream("big.csv", "utf-8");
stream.on("data", chunk => process(chunk));

// Path
path.join(__dirname, "data", "file.txt");
path.extname("archive.tar.gz"); // ".gz"

// Directory
await fs.mkdir("newdir", { recursive: true });
const files = await fs.readdir(".");
```

A few differences stand out:
- Node.js fs/promises is async by default; the C# File class is sync (use File.ReadAllTextAsync and friends when you need async)
- The using statement auto-disposes a StreamReader, much like calling stream.destroy() in a finally block in Node.js
- System.Text.Json's JsonSerializer replaces JSON.parse/JSON.stringify; both support indented output
- Path.Combine, like path.join, handles OS path separators automatically
- Directory.CreateDirectory is always recursive; fs.mkdir needs { recursive: true }
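The sync-versus-async point is the one that trips people up most, so here is a minimal TypeScript sketch of both styles on the Node side (the file name is invented for illustration); the comments note the rough C# counterparts:

```typescript
import fs from "node:fs/promises";
import { readFileSync, writeFileSync } from "node:fs";

// Node's default API (fs/promises) returns Promises and must be awaited;
// the sync variants mirror what C#'s File class gives you by default.
writeFileSync("demo.txt", "hello");                      // like File.WriteAllText
const viaAsync = await fs.readFile("demo.txt", "utf-8"); // like File.ReadAllTextAsync
const viaSync = readFileSync("demo.txt", "utf-8");       // like File.ReadAllText

console.log(viaAsync === viaSync); // true: same contents either way
```

If you stay on the promise API, remember that forgetting await hands you a Promise object rather than the file contents.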
Step-by-Step Breakdown
1. Simple Read/Write
File.ReadAllText/WriteAllText are the synchronous equivalents of fs.readFileSync/writeFileSync. For async, use the Async variants.
TypeScript:

```typescript
await fs.readFile("f", "utf-8");
await fs.writeFile("f", "data");
```

C#:

```csharp
// Sync (most common)
File.ReadAllText("f");
File.WriteAllText("f", "data");

// Async
await File.ReadAllTextAsync("f");
await File.WriteAllTextAsync("f", "data");
```

2. StreamReader with using
using auto-calls Dispose() at block end — equivalent to Node.js try/finally. Always use it with streams.
TypeScript:

```typescript
const stream = createReadStream("f");
try { for await (const chunk of stream) { /* ... */ } }
finally { stream.destroy(); }
```

C#:

```csharp
using var r = new StreamReader("f");
while (!r.EndOfStream) {
    string? line = r.ReadLine();
    Process(line);
}
// r.Dispose() called automatically
```

3. JSON Serialization
System.Text.Json is the built-in JSON library. For custom field names, annotate properties with the [JsonPropertyName] attribute.
TypeScript:

```typescript
JSON.parse(text) as Config;
JSON.stringify(obj, null, 2);
```

C#:

```csharp
var cfg = JsonSerializer.Deserialize<Config>(text);
string json = JsonSerializer.Serialize(cfg,
    new JsonSerializerOptions { WriteIndented = true });
```

4. Path Operations
Path.Combine handles OS separators (/ on Linux and macOS, \ on Windows). GetExtension and GetFileName parse path components.
TypeScript:

```typescript
path.join(dir, "sub", "file.txt");
path.extname("f.txt"); // ".txt"
```

C#:

```csharp
Path.Combine(dir, "sub", "file.txt");
Path.GetExtension("f.txt");         // ".txt"
Path.GetFileName("dir/f.txt");      // "f.txt"
Path.GetDirectoryName("dir/f.txt"); // "dir"
```

Common Mistakes
When coming from TypeScript, developers often make these mistakes:
- Calling File.ReadAllText in async code and expecting it to be non-blocking; it's synchronous, so use File.ReadAllTextAsync there
- Forgetting to dispose a StreamReader; wrap it in a using statement, just as you'd call stream.destroy() in a finally block in Node.js
- Reaching for JSON.parse/JSON.stringify; the C# equivalents are JsonSerializer.Deserialize<T>() and JsonSerializer.Serialize()
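To make the JSON habit swap concrete, here is one short round trip on the TypeScript side (the Config shape is invented for illustration); the comments note the JsonSerializer equivalents:

```typescript
interface Config { name: string; retries: number; }

const text = '{"name":"app","retries":3}';

// C#: JsonSerializer.Deserialize<Config>(text)
const cfg = JSON.parse(text) as Config;

// C#: JsonSerializer.Serialize(cfg, new JsonSerializerOptions { WriteIndented = true })
const pretty = JSON.stringify(cfg, null, 2);

console.log(cfg.retries);           // 3
console.log(pretty.includes("\n")); // true: indented output spans multiple lines
```

Note that unlike Deserialize<Config>(), JSON.parse does no runtime shape checking; the `as Config` cast is purely a compile-time assertion.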
Key Takeaways
- File.ReadAllText/WriteAllText for simple files; Async variants for async contexts
- using var reader = new StreamReader() auto-disposes — always use for streams
- System.Text.Json.JsonSerializer: Deserialize<T>() and Serialize() with JsonSerializerOptions
- Path.Combine handles OS separators; GetExtension/GetFileName parse path components