(aka "edge functions"
or "edge workers")
Slides: bit.ly/edge-compute
austingil.com | @heyAustinGil
I work at Akamai (akamai.com)
We offer a lot of relevant services
Really hard not to mention (kind of my job)
Impossible to be 100% unbiased
This is not a sales pitch in disguise
I’m here to teach practical, general concepts
I may mention Akamai because it’s familiar
But this info applies anywhere
Edge compute is the result of distributing serverless functions to multiple locations around the world to handle requests from as close to the user as possible.
The result is dynamic responses with the least amount of latency.

A mediocre analogy for delivering websites
…but really good photos ¯\_(ツ)_/¯
Compute = bleeps & bloops -> stuff (like HTML)
User request travels to a machine (owned or rented) in a specific location, which runs your code and returns the HTML that’s rendered on the page.
(on-prem, VPS, cloud functions)

User downloads your code to run on THEIR machine to generate HTML.
(JavaScript, service workers, WASM)

Machine you control builds website ahead of time into static folders and files, allowing for immediate HTML response.
(technically still SSR)

(mostly)
Network of globally distributed computers that deliver STATIC assets closer to users, reducing latency and improving performance.

Can be IoT, cell networks, ISPs, you and your friend’s Raspberry Pis.
But usually, for the web it’s CDNs.
Servers
😀 Powerful + Dynamic
😟 Latency

Client
😀 Latency + Dynamic
😟 Powerful

Static (SSG + CDN)
😀 Powerful + Latency
😟 Dynamic

Programmable runtimes (like cloud functions)...that are globally distributed (within a CDN)...and can cache content (like SSG).
Dynamic server-side functionality...that executes as close to users as possible...and may avoid repeat work.
They sit BETWEEN a client and origin server and have access to the client request and the server response.
Plus access to:
Dynamic responses with…
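Many edge runtimes expose some variant of a fetch-style handler for this in-between position (Akamai EdgeWorkers uses event callbacks like onClientRequest instead). A generic sketch, assuming the WHATWG fetch globals available in Node 18+ and most edge runtimes, with `fetchOrigin` standing in for however the runtime reaches your origin:

```javascript
// Generic edge-function sketch: sits between client and origin, can read
// the client request and rewrite the origin response. Not any one
// vendor's API; `fetchOrigin` is a stand-in for reaching your origin.
async function handleRequest(request, fetchOrigin) {
  const url = new URL(request.url);

  // Some requests never need the origin at all.
  if (url.pathname === '/ping') {
    return new Response('pong', { status: 200 });
  }

  // Otherwise, forward to the origin and modify the response on the way back.
  const originResponse = await fetchOrigin(request);
  const headers = new Headers(originResponse.headers);
  headers.set('x-served-by', 'edge'); // illustrative header
  return new Response(originResponse.body, {
    status: originResponse.status,
    headers,
  });
}
```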
😱
😵
💻: client
🤵: origin server
🔪: edge node
🎯: target
💻---------------🤵-🎯
vs.
💻-🔪---------------🎯

💻---------------🤵---------------🎯
vs.
💻-🔪---------------🎯

(a.k.a. API Orchestration)
┌----------------🎯
💻----------------🤵-🎯
└------🎯
vs.
┌----------------🎯
💻-🔪---🎯
└------🎯
(e.g. Blog Post -> Categories -> Related Posts)
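A sketch of that orchestration moving to the edge: the edge function makes the chained calls itself and returns one stitched payload, so the client pays for one round trip instead of three (`fetchPost` and `fetchRelated` are hypothetical stand-ins for real API calls):

```javascript
// API orchestration sketch: the dependent requests
// (post -> categories -> related posts) happen edge-side; the client
// gets ONE response. `api.fetchPost` / `api.fetchRelated` are
// hypothetical stand-ins for real API calls.
async function getPostPage(postId, api) {
  const post = await api.fetchPost(postId);                // first hop
  const related = await api.fetchRelated(post.categories); // depends on the first
  return { ...post, related };                             // single payload back
}
```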
💻---------------🤵-🎯-🤵-🎯-🤵---------------💻
vs.
💻-🔪---------------🎯---------------🔪---------------🎯---------------🔪-💻
🤨
🤑
Remember to consider latency, size, devices, location, data etc.
Client-side JS -> Client-side service worker -> Cloud functions -> Traditional servers
Client-side JS -> Client-side service worker -> Edge compute -> Cloud functions -> Traditional servers

I think it will be, but it needs to be easier.
It could cost them $5b and they would STILL make a profit
(Based on 1% of Amazon’s revenue and average developer salary for 2023)
(More legit case studies of performance vs. UX/biz metrics at wpostats.com)
Building for distributed systems is hard.
Edge compute adds complexity.
Small projects may not see as good an ROI.
The important thing is to understand distributed systems and architecture.
Build applications for today, in such a way that can scale tomorrow.
And if tomorrow you tip those scales, give Akamai EdgeWorkers a try.
Edge compute, web development,
career, chiweenies, whatever :D