Will talk about Linux, plants, space, retro games, and anything else I find interesting.
Also mesa@piefed.social over on Piefed.
I found out via a Matrix channel lol. It's almost never used except by some close friends, so it's nice we have a backup.
Yep agreed. I guess I just had a misconception at first.
The only “issue” I have with libre is that it's essentially a full pull of Firefox Nightly with some Rust patches on top. It's reliant on Firefox, so it's not really a “new” browser per se.
That being said, I use it every day :) It's an excellent project.
Looks like it's down slightly too.
It would be ironic if businesses come back to California because of the tariffs.
I’m already seeing some huge price increases on IC chips online here…
Love it
Yeah, it's VERY expensive and only useful for a very niche setup. But it's cool that we have options like this in the first place.
Unfortunately it’s sold out. In fact, it was only available for a very short time.
I used fail2ban + my router to block the IP addresses. Then if the headers come from OpenAI, they also get bounced.
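For the header side, a hook like this is one way to do it in Flask (a minimal sketch; GPTBot, ChatGPT-User, and OAI-SearchBot are the user-agent tokens OpenAI documents for its crawlers, so treat the list as an assumption you keep current):

from flask import Flask, request, abort

app = Flask(__name__)

# User-agent tokens OpenAI documents for its crawlers (assumed list; extend as needed)
BLOCKED_UA_TOKENS = ("GPTBot", "ChatGPT-User", "OAI-SearchBot")

@app.before_request
def bounce_ai_crawlers():
    ua = request.headers.get("User-Agent", "")
    if any(token in ua for token in BLOCKED_UA_TOKENS):
        abort(403)  # bounce before the request reaches any route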
Below is a template I created on the fly for an AI black hole I also made. It's decent, but I feel like it could be better.
from flask import Flask, request, render_template_string
import time
from collections import defaultdict
import random

app = Flask(__name__)

# Data structure to keep track of request timestamps per IP
ip_requests = defaultdict(list)
IP_REQUEST_THRESHOLD = 1000  # Requests threshold for one hour
TIME_WINDOW = 3600  # Time window of one hour in seconds

# Track requests per IP and return how many landed inside the time window
def track_requests(ip):
    current_time = time.time()
    # Remove old requests, then record this one
    ip_requests[ip] = [t for t in ip_requests[ip] if current_time - t < TIME_WINDOW]
    ip_requests[ip].append(current_time)
    return len(ip_requests[ip])

# Serve slow pages incrementally
@app.route('/')
def index():
    ip = request.remote_addr
    request_count = track_requests(ip)
    if request_count > IP_REQUEST_THRESHOLD:
        return serve_slow_page(request_count)
    else:
        return 'Welcome to the site!'

def serve_slow_page(request_count):
    """Serve a progressively slower page."""
    delay = min(10, request_count / 1000)  # Slow down incrementally, max 10 seconds delay
    time.sleep(delay)  # Delay to slow down the request
    # Generate the next "black hole" link
    next_page_link = f'/slow/{random.randint(1000, 9999)}'
    html_content = f"""
    <html>
    <head><title>Slowing You Down...</title></head>
    <body>
    <h1>You are being slowed down!</h1>
    <p>This is taking longer than usual because you're making too many requests.</p>
    <p>You have made more than {IP_REQUEST_THRESHOLD} requests in the past hour.</p>
    <p>Next step: <a href="{next_page_link}">Click here for the next page...</a></p>
    </body>
    </html>
    """
    return render_template_string(html_content)

@app.route('/slow/<int:page_id>')
def slow_page(page_id):
    ip = request.remote_addr
    request_count = track_requests(ip)
    if request_count > IP_REQUEST_THRESHOLD:
        return serve_slow_page(request_count)
    else:
        return 'Welcome back to normal!'

if __name__ == '__main__':
    app.run(debug=True)
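One caveat if you run this behind a reverse proxy: request.remote_addr will be the proxy's address, so every visitor shares one counter. Werkzeug's ProxyFix middleware restores the real client IP, assuming exactly one trusted proxy sets X-Forwarded-For:

from werkzeug.middleware.proxy_fix import ProxyFix

# Trust one hop of X-Forwarded-For so request.remote_addr is the real client IP
app.wsgi_app = ProxyFix(app.wsgi_app, x_for=1)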
Yep, same thing. I have some small servers and was getting hammered by AI crawlers on OpenAI-controlled IPs that don't respect robots.txt. I had to block all their IP addresses and create an AI black hole in order to stop them from DDoSing my tiny site(s).
You can say “suicide” on the fediverse, unlike on other platforms.
It’s terrible what ss is doing.