last month, i posted about my website basement community on hacker news and it sorta blew up. you can view the post here. it blew up so much, i was totally unprepared for what came with all that traffic, so without further ado, here are 4 things i learned running a website with mostly user-generated content:

1. someone is going to abuse your site

this one may sound pretty obvious. you may have heard about XSS attacks, or maybe even stories of people brute-forcing their way into production servers, but what i’m talking about is a kind of attack you might not even consider when running a social media site: dumb-ass people being really obnoxious.

when the site first got a surge of users from hacker news, there was one poster in particular who came to the site, registered a bunch of offensive, racist usernames and proceeded to post and create threads that were just full of dumb slurs. this was definitely a learning experience because i had to act quickly, so i tried a bunch of different methods to get rid of him.

  1. i banned his account

    • this only worked momentarily. his account was banned, but nothing stopped him from registering a completely new account and continuing to post racist shit
  2. i blocked his IP address

    • this worked for a little bit longer, but he got on a VPN, and when i blocked that IP he switched to another, and another, etc, etc.
  3. i added a site setting to turn off registrations entirely

    • this was a good band-aid solution, but it wasn’t great. i wanted to capitalize on the traffic that was still coming in, so i later allowed people to register again, but flagged each new account as “unapproved.” then i went through and manually approved each account i thought was legit.
  4. i created a denylist for usernames

    • this feature came much later, but one of the last things i did to mitigate any more abuse was to create a denylist of words so that people couldn’t register racist or offensive usernames
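the denylist check itself is simple. here’s a minimal sketch of the idea in python — the `DENYLIST` contents and the `is_username_allowed` helper are hypothetical stand-ins, not basement community’s actual code:

```python
# hypothetical sketch of a username denylist check -- not the site's
# actual implementation. the placeholder words stand in for real slurs.
DENYLIST = {"slur1", "slur2"}

def is_username_allowed(username: str) -> bool:
    """reject usernames containing any denylisted word, ignoring case
    and common separators like dots, dashes, and underscores."""
    normalized = username.lower()
    for sep in ".-_":
        normalized = normalized.replace(sep, "")
    return not any(word in normalized for word in DENYLIST)

print(is_username_allowed("cool_poster"))  # True
print(is_username_allowed("SLUR1-fan"))    # False
```

stripping separators before matching matters, because the first thing an abuser tries is sneaking a denylisted word past you with dots and dashes.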

2. you need a performance monitoring system

this also may seem like a no-brainer, but when hacker news ran the traffic up on basement community, the site did not handle the load well and i had almost zero insight into what was going wrong. were the SQL queries too expensive? was the main document taking too long to load? was there too much JavaScript running? i had no idea. that’s why, very shortly after seeing the traffic, i integrated Sentry into both my front-end and back-end code so i could get a full picture of where the problems were, which leads to my next point
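to illustrate the kind of per-endpoint timing data a monitoring tool surfaces, here’s a crude stand-in sketched with a plain decorator — this is a hypothetical example, not Sentry’s actual API, and `render_thread_page` is a made-up handler:

```python
import time
from functools import wraps

TIMINGS = {}  # handler name -> list of elapsed seconds

def timed(fn):
    """record how long each call to a request handler takes, so slow
    endpoints stand out -- a crude stand-in for what a real
    monitoring tool like Sentry reports automatically."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            TIMINGS.setdefault(fn.__name__, []).append(time.perf_counter() - start)
    return wrapper

@timed
def render_thread_page(thread_id):
    time.sleep(0.01)  # pretend to hit the database and render a template
    return f"thread {thread_id}"

render_thread_page(42)
print(TIMINGS["render_thread_page"])  # one elapsed time, roughly 0.01s
```

even this much — just knowing *which* handler is slow — narrows the search enormously compared to staring at a slow page with no data.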

3. your SQL queries need to be optimized

after getting some logs to work with, i noticed almost instantly that a lot of the performance issues stemmed from the fact that i had way too many SQL queries running just to display a simple page. this is mostly because i relied on a SQL ORM, which, in short, is a tool that lets you talk to the database without writing raw SQL, making development easier to pick up and faster. the biggest downside is that it might quietly execute 50 queries against your database to fetch a list of information when it probably only needs 1, which causes slowdown.
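this is the classic “N+1 queries” problem. here’s a self-contained demo using python’s built-in sqlite3 — the table and column names are made up for illustration, not basement community’s actual schema:

```python
import sqlite3

# in-memory demo of the N+1 query problem an ORM can silently create;
# the schema here is invented for illustration
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE threads (id INTEGER PRIMARY KEY, title TEXT, user_id INTEGER);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO threads VALUES (1, 'hello', 1), (2, 'intro', 2), (3, 'rules', 1);
""")

# N+1 style: one query for the threads, then one MORE query per thread
# to look up its author -- 4 round trips for 3 threads
threads = db.execute("SELECT id, title, user_id FROM threads ORDER BY id").fetchall()
page = []
for tid, title, user_id in threads:
    (author,) = db.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
    page.append((title, author))

# optimized: a single JOIN returns the same data in one round trip
page_fast = db.execute("""
    SELECT threads.title, users.name
    FROM threads JOIN users ON users.id = threads.user_id
    ORDER BY threads.id
""").fetchall()

print(page == page_fast)  # True -- same data, 1 query instead of 4
```

most ORMs can generate the JOIN version too, but you usually have to ask for it explicitly (eager loading); the lazy default is what bites you.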

this was clearly a problem, and i have been tackling it piecemeal – optimizing one page at a time. since the initial rush of traffic, i’m pleased to say that the load times for the site have gotten much better.

4. your users might have genius ideas

finally, the last note i’ll make here is about listening to feedback. since launching the site, some users have suggested pretty smart features that i’ve since implemented, like this back-to-top button to quickly get back to the top of the page:

back to top button

or even non-technical features, such as creating a forum for reading, which i’ve also added.

point is, listen to your users. they might have better ideas than you!


like this post? did i get anything wrong? discuss it on the forums at basementcommunity.com
