Is there a good alternative to github pages? I need just a static website up.
- I have a domain.
- I have my site (local machine)
- And that’s all I have.
- I have a machine that could be running 24/7 too.
- Any of the generators listed at https://staticsitegenerators.bevry.me/
- Any webserver + virtualhost config that serves plain HTML pages
- A build/upload script (minimal sketch below)
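For that last item, something this small does the job. A sketch only: the generator, host, and paths are placeholders for whatever you actually use.

```sh
#!/usr/bin/env sh
# deploy.sh: build the site, then mirror the output to the server.
# "you@example.com" and both paths are placeholders; adjust for your setup.
set -e

# Build with whichever generator you picked (hugo, jekyll build, mkdocs build, ...)
hugo

# Mirror the generator's output dir to the web root, deleting stale files.
rsync -avz --delete ./public/ you@example.com:/var/www/html/
```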
I have not deployed Garage S3, but it has a static pages feature you could use: just build your static files with Jekyll or something, create a bucket, and set the permissions.
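An untested sketch of that flow, going off Garage's docs. The names are made up, and the subcommands have shifted a bit between Garage versions, so check `garage --help` against your install.

```sh
# Create the bucket and turn on website serving for it.
garage bucket create my-site
garage bucket website --allow my-site

# Create an access key and grant it read/write on the bucket.
garage key create my-site-key
garage bucket allow --read --write my-site --key my-site-key

# Upload the built site with any S3 client, pointed at the Garage endpoint
# (3900 is Garage's default S3 port; export the key as AWS credentials first).
aws s3 sync ./public s3://my-site --endpoint-url http://localhost:3900
```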
So, uh…
Digital Ocean is pretty inexpensive at US$7 monthly for 1 vCPU/1GB RAM with 1TB transfer. Decent platform. US-based, alas.
(2025 September, for the archives)
Oracle Cloud will give you far more for free.
Oracle Cloud will also delete your shit for the price of admission.
Caveat emptor, hey?
Mine has been running for years now without any such deletions.
And I genuinely hope it stays that way for years more to come. Cheers.
If you don’t care about uptime, self-host it on the local machine you have and expose it through free Cloudflare Tunnels.
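A rough sketch of the Tunnels flow, assuming a web server already listening on localhost:8080 and a domain on Cloudflare DNS. The hostname and tunnel name are placeholders.

```sh
# One-off "quick tunnel": random trycloudflare.com URL, no account needed.
cloudflared tunnel --url http://localhost:8080

# Named tunnel on your own domain:
cloudflared tunnel login
cloudflared tunnel create mysite
cloudflared tunnel route dns mysite www.example.com
cloudflared tunnel run --url http://localhost:8080 mysite
```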
There’s actually a surprising amount of free static website hosting out there. Besides GitHub, GitLab, Cloudflare, and Netlify come to mind offhand.
Codeberg does too
Codeberg isn’t for just any static website, though. It’s for FOSS projects; their FAQ addresses this.
Neocities?
I also thought about it, but the custom domain feature only works on the $5/month plan.
AWS S3 lets you upload all your content to a bucket, then mark the bucket as a website. If usage is not too heavy, it can stay within the free tier.
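A sketch of that flow with the AWS CLI. The bucket name is a placeholder, and note that the S3 free tier only covers the first 12 months.

```sh
# Make the bucket, enable static website hosting, upload the built site.
aws s3 mb s3://www.example.com
aws s3 website s3://www.example.com --index-document index.html --error-document 404.html
aws s3 sync ./public s3://www.example.com --delete

# Heads up: new buckets block public access by default, so you still have to
# lift that block and attach a public-read bucket policy before the website
# endpoint will serve anything.
```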
But a favorite free one is Cloudflare Pages: https://www.geeksforgeeks.org/techtips/deploying-static-website-to-cloudflare-pages/
You can keep your content on GitHub, connect it to a Cloudflare Pages project, and have it auto-update on every push to GitHub.
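If you’d rather not connect a GitHub repo at all, Pages also takes direct uploads via Wrangler. A sketch, with a made-up project name:

```sh
npx wrangler login
npx wrangler pages deploy ./public --project-name=my-site
```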
If you want free static hosting, then probably: https://wasmer.io/
If you have the machine at home then you could set up port forwarding to it, but you would need to do everything yourself (rough sketch after this list), like:
- running a web server like nginx
- setting up SSL for it with certbot
- storing the static files in /var/www/html, for example
- port forwarding from your router to that machine
- using a service like DuckDNS to point a domain at your dynamic IP at home
- pointing a CNAME record on your domain at the DuckDNS subdomain
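A rough sketch of the web server part on Debian/Ubuntu. example.com and the paths are placeholders, and sites-available/sites-enabled is the Debian layout specifically.

```sh
sudo apt install nginx certbot python3-certbot-nginx

# Minimal server block serving static files.
sudo tee /etc/nginx/sites-available/example.com >/dev/null <<'EOF'
server {
    listen 80;
    server_name example.com www.example.com;
    root /var/www/html;
    index index.html;
}
EOF
sudo ln -s /etc/nginx/sites-available/example.com /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx

# Let certbot fetch a certificate and rewrite the config for HTTPS.
sudo certbot --nginx -d example.com -d www.example.com
```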
I built this for my personal use: https://git.prisma.moe/aichan/simple_web_editor
Something that may help:
Why doesn’t GitHub Pages fit your use case? It’s nice to get free static hosting from them.
AI encroachment
In what way? Anything on the public internet is likely being used for AI training. I guess by using free GitHub you can’t object to training.
Then again, anywhere you host you run into much the same problem. You can use robots.txt, but crawlers don’t have to honor it.
If you self-host there are some ways to fight back, and depending on your opinion of Cloudflare, they seem to be fairly effective at blocking the AI crawlers.
Yep, and on top of simply blocking, if you’re self-hosting or using Cloudflare, you can enable AI tarpits.
How do I do this? I don’t mind (and may prefer) to host not at home. My main concern with GH is that you become an AI snack whether you like it or not.
Which part? If you want to use Cloudflare Pages, it’s relatively straightforward. You can follow this and get up and running pretty quickly: https://www.hongkiat.com/blog/host-static-website-cloudflare-pages/
If you’re asking about the tarpits, there are generally two ways to accomplish that. Even if you don’t use Cloudflare Pages to host your site directly (if you use nginx on your own server, for example), you can still enable AI tarpits for your entire domain, so long as you use Cloudflare as your DNS provider. If you use Pages, the setup is mostly the same: https://blog.cloudflare.com/ai-labyrinth/#how-to-use-ai-labyrinth-to-stop-ai-crawlers
If you want to do it all locally, you could instead set up iocaine or nepenthes, which are both self-hosted and can integrate with various web server software. Obviously, Cloudflare’s tarpits are stupid simple to set up compared to these, but these give you greater control over exactly how you’re poisoning the well and trapping crawlers.
GitHub, acquired by Microsoft, is now forcing AI on its user base.
That’s one of my main drivers to stay away from GH.
I don’t want to serve my work on a silver platter to their AI.
Codeberg Pages
I use nginx. You can have configs for different sites (I use a file per site), with each server block’s server_name set to its domain. A site can either be static, served from a root folder, or use proxy_pass for an actively running server; nginx maps incoming domains to the matching server blocks, and you should also have a default. You can then point multiple domains at the same IP address. Keep in mind that home internet often has a dynamic IP, so you may need to update DNS every so often. There are services to help with the dynamic IP; I think noip.com has a solution available, but feel free to look around.
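A rough sketch of those per-site files; domains, ports, and paths are all placeholders.

```sh
# Static site, served from a root folder.
sudo tee /etc/nginx/sites-available/blog.example.com >/dev/null <<'EOF'
server {
    listen 80;
    server_name blog.example.com;
    root /var/www/blog;
    index index.html;
}
EOF

# Actively running server on the same box, via proxy_pass.
sudo tee /etc/nginx/sites-available/app.example.com >/dev/null <<'EOF'
server {
    listen 80;
    server_name app.example.com;
    location / {
        proxy_pass http://127.0.0.1:3000;
    }
}
EOF

# Default server for any hostname you haven't configured.
sudo tee /etc/nginx/sites-available/default >/dev/null <<'EOF'
server {
    listen 80 default_server;
    return 444;
}
EOF
```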
If you’ve already got something at home to run it on and want it easy to set up/maintain, take a look at MkDocs.
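The whole MkDocs loop is about four commands (the project name is a placeholder):

```sh
pip install mkdocs
mkdocs new mysite && cd mysite
mkdocs serve   # live-reloading preview at http://127.0.0.1:8000
mkdocs build   # static output lands in ./site/
```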
I recently used Jekyll (https://jekyllrb.com/) as a static site generator and found it easy to use. I personally went with GitLab Pages, because I didn’t feel confident hosting on my home internet (didn’t want to inadvertently cause issues for my housemates while I’m still learning this stuff).
The nice thing about static sites is that it’s pretty easy to find free or extremely cheap hosting for them.
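For reference, the Jekyll quickstart is pretty short too (the project name is a placeholder; you need Ruby installed):

```sh
gem install bundler jekyll
jekyll new my-blog && cd my-blog
bundle exec jekyll serve   # preview at http://127.0.0.1:4000
bundle exec jekyll build   # static output lands in ./_site/
```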