2.6 Performance
Performance is
- the time required to process a single request (latency)
- the rate at which requests are processed (throughput)
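The two are related: if a server handles N requests concurrently with an average latency of L seconds, its throughput is roughly N / L (Little's Law). For example, 100 concurrent requests at 50 ms each gives about 100 / 0.05 = 2,000 requests per second.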
Response time = network delay + service time
- response time = client-side latency
- network delay = network latency
- service time = server-side latency
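A minimal way to see this split in practice is to measure response time on the client and subtract the service time the server reports. The sketch below is illustrative only: the endpoint is hypothetical, and it assumes the server exposes its processing time in a Server-Timing style header in milliseconds.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ResponseTimeProbe {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/orders")) // hypothetical endpoint
                .GET()
                .build();

        long start = System.nanoTime();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        long responseTimeMs = (System.nanoTime() - start) / 1_000_000;

        // Assumption: the server reports its own processing time, e.g. "Server-Timing: app;dur=12.5".
        long serviceTimeMs = response.headers()
                .firstValue("Server-Timing")
                .map(v -> (long) Double.parseDouble(v.replaceAll("[^0-9.]", "")))
                .orElse(0L);

        System.out.println("response time (ms): " + responseTimeMs);
        System.out.println("service time  (ms): " + serviceTimeMs);
        System.out.println("network delay (ms): " + (responseTimeMs - serviceTimeMs));
    }
}
```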
Network latency
- OSI model
- network protocols
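Protocol choice matters largely through round trips: every extra TCP/TLS handshake adds network delay. A rough sketch, assuming a hypothetical endpoint, of configuring Java's HttpClient to prefer HTTP/2 so one connection is reused and multiplexed across requests:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class Http2Client {
    public static void main(String[] args) throws Exception {
        // Prefer HTTP/2: one TCP/TLS connection is reused and requests are multiplexed,
        // avoiding repeated handshakes (extra round trips) per request.
        HttpClient client = HttpClient.newBuilder()
                .version(HttpClient.Version.HTTP_2)
                .connectTimeout(Duration.ofSeconds(2))
                .build();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/health")) // hypothetical endpoint
                .GET()
                .build();

        // Both calls reuse the same underlying connection when the server supports it.
        for (int i = 0; i < 2; i++) {
            HttpResponse<Void> response = client.send(request, HttpResponse.BodyHandlers.discarding());
            System.out.println("status: " + response.statusCode() + ", version: " + response.version());
        }
    }
}
```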
Server-side latency
- faster algorithms
- memory versus disk trade-offs
- thread pools and parallel processing (see the sketch after this list)
- local cache
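A rough sketch of the last two levers combined: a fixed thread pool processes requests in parallel, and a ConcurrentHashMap serves as a local cache in front of a slow (simulated) database read. Names, pool size, and the simulated delay are illustrative.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ProductService {
    // Local in-memory cache: repeated reads skip the slow lookup entirely.
    private final Map<Long, String> cache = new ConcurrentHashMap<>();

    // Fixed-size thread pool: requests are processed in parallel instead of serially.
    private final ExecutorService pool = Executors.newFixedThreadPool(8);

    public void handle(long productId) {
        pool.submit(() -> {
            String details = cache.computeIfAbsent(productId, this::loadFromDatabase);
            System.out.println("served product " + productId + ": " + details);
        });
    }

    // Stand-in for a slow disk or database read (hypothetical).
    private String loadFromDatabase(long productId) {
        try {
            Thread.sleep(100); // simulate I/O latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "details-" + productId;
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        ProductService service = new ProductService();
        for (long id : new long[] {1, 2, 1, 2, 1}) { // repeated ids hit the cache
            service.handle(id);
        }
        service.shutdown();
    }
}
```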
Client-side latency
- blocking versus non-blocking I/O (see the sketch after this list)
- message formats
- data compression
- CDN
- external cache
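A sketch of non-blocking I/O plus data compression from the client side, assuming a hypothetical endpoint: the request is sent asynchronously so the calling thread is not blocked, and an Accept-Encoding: gzip header asks the server for a smaller payload. Java's HttpClient does not decompress gzip bodies itself, so the sketch gunzips the body by hand.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.CompletableFuture;
import java.util.zip.GZIPInputStream;

public class AsyncCompressedFetch {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newHttpClient();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/report")) // hypothetical endpoint
                .header("Accept-Encoding", "gzip")                 // ask for a compressed payload
                .GET()
                .build();

        // Non-blocking I/O: sendAsync returns immediately and the callback runs when the
        // response arrives, so the calling thread stays free in the meantime.
        CompletableFuture<Void> pending = client
                .sendAsync(request, HttpResponse.BodyHandlers.ofByteArray())
                .thenAccept(response -> {
                    byte[] body = response.body();
                    boolean gzipped = response.headers()
                            .firstValue("Content-Encoding")
                            .map("gzip"::equalsIgnoreCase)
                            .orElse(false);
                    byte[] data = gzipped ? gunzip(body) : body;
                    System.out.println("received " + body.length + " bytes on the wire, "
                            + data.length + " bytes after decompression");
                });

        System.out.println("request sent, main thread not blocked");
        pending.join(); // only so the demo does not exit before the response arrives
    }

    // HttpClient does not decompress gzip bodies itself, so do it explicitly.
    private static byte[] gunzip(byte[] body) {
        try (InputStream in = new GZIPInputStream(new ByteArrayInputStream(body))) {
            return in.readAllBytes();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```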