Admin of a number of fediverse servers, including the .world ones:

  • lemmy.world
  • mastodon.world
  • calckey.world

You can find me on these servers as @ruud

I receive a lot of messages, both direct messages and mentions, and I can’t reply to them all. If you have an issue, please e-mail info@lemmy.world

  • 41 Posts
  • 182 Comments
Joined 6M ago
Cake day: Jun 01, 2023


Depends on what sort of invoices. For invoicing my billable hours, I use Kimai.


I fixed that link, but actually lemmy.world/legal should redirect to legal.lemmy.world, which it apparently doesn’t always do. So we’ll look into that as well.

Thanks for pointing this out.


Yes, there have been quite a lot of trolls and attacks on our server, so we’re very careful about who we let onto the team. I’m sure you’ll understand.


No. Just like now, they store the metadata there; the images are on disk or S3.







That’s not the pictures… the pictures take up 1.2 TB. It’s just the database with the metadata about the pictures. (11.8 GB now…)



It’s building a 0.4 database, which is already twice as big as the 0.3 one


OK, the pictrs database upgrade is taking its time… please wait… ;-)

```
pictrs_1     | {"timestamp":"2023-10-02T16:31:44.746467Z","level":"WARN","fields":{"message":"new"},"target":"pict_rs::repo","span":{"name":"Migrating Database from 0.3 layout to 0.4 layout"},"spans":[]}
```


(The thread. Not the firefish.mastodon.world, I don’t like that ;-) )




I've written a short blog post about what happened in August, and the finances.

I just set up a Matrix server at lemmy.world, so just register an account there if you like ;-)




Yes, that’s the plan. But determining which instances are suitable isn’t easy… I’ll ask them what the status is on this.


Ah OK. Yeah, Lemmy has had big growth the past months. Mastodon’s growth is mostly absorbed by .social.



[Fixed] Comments temporarily broken, being worked on
There was another attack going on (as you might have noticed). We're working on a fix. In the meantime, we've blocked the listing of comments, so we at least aren't down, but it did break comments. Hope to have a fix in the next hour. Stay tuned! **Update** OK, we've implemented a fix; again many thanks to [@sunaurus@lemm.ee](https://lemm.ee/u/sunaurus) for his assistance. This will prevent the outages we've seen the last couple of days. Let's see what they will come up with next...
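The post doesn't say how the listing was blocked; one plausible way to do it at the reverse proxy, assuming nginx (which later posts mention) and Lemmy's `/api/v3/comment/list` endpoint, would be a sketch like this:

```nginx
# Hypothetical snippet: short-circuit the comment-list endpoint at the proxy
# so the expensive query never reaches the database; everything else keeps working.
location = /api/v3/comment/list {
    return 503;
}
```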

A few days ago I saw some cool [JoinLemmy stickers](https://www.etsy.com/listing/1532398821/place-2023sticker-packsmotivational?click_key=202669597a80d6f60b5afb9b21260113b41e3b90%3A1532398821&click_sum=dc8e01d4&ref=shop_home_active_33) created by [@cheeseblintzes@lemmy.world](https://lemmy.world/u/cheeseblintzes) . I asked her if she could also create lemmy.world stickers, and she did! You can see and order them [here](https://www.etsy.com/listing/1535729033/lemmyworld-stickers-motivational), also check the [other cool stickers in her shop](https://www.etsy.com/shop/thespookyfarmer). Thanks for creating them!

Outage today (2023-07-31) from 02:00 UTC - 05:45 UTC
Lemmy.world has been down between 02:00 UTC and 05:45 UTC. This was caused by the database spiking to 100% CPU (all 32 cores/64 threads!) due to inefficient queries being fired at the db very often. I’ve collected the logs and we’ll be checking how to prevent this (and what caused it).

[Done] Lemmy.world was upgraded to 0.18.3 today (2023-07-30)
**Update** The upgrade was done; DB migrations took around 5 minutes. We'll keep an eye out for (new) issues, but for now it seems to be OK. **Original message** We will upgrade lemmy.world to 0.18.3 today at 20:00 UTC+2 ([check what this is in your timezone](https://www.timeanddate.com/worldclock/fixedtime.html?msg=Lemmy.world+upgrade+to+lemmy+18.3&iso=20230730T20&p1=16)). Expect the site to be down for a few minutes. **Edit** I was warned it could be more than a few minutes. The database update might even take 30 minutes or longer. Release notes for 0.18.3 can be found here: https://github.com/LemmyNet/lemmy/blob/main/RELEASES.md (This is unrelated to the downtimes we experienced lately; those are caused by attacks that we're still looking into mitigating. Sorry for those.)

Lemmy.world update: Downtime today / Cloudflare
Today, like the past few days, we have had some downtime. Apparently some script kiddies are enjoying themselves by targeting our server (and others). Sorry for the inconvenience. Most of these 'attacks' are targeted at the database, but some are more DDoS-like and can be mitigated by using a CDN. Some other Lemmy servers are using Cloudflare, so we know that works. Therefore we have chosen Cloudflare as our CDN / DDoS protection platform for now. We will look into other options, but we needed something implemented asap. As for the other attacks, we're analyzing them to investigate and implement measures like rate limiting etc.

As requested by some users: 'old' style now accessible via https://old.lemmy.world Code can be found here: https://github.com/rystaf/mlmym , created by [Ryan](https://github.com/rystaf) (Is he here?) (Yes he appears to be! [@nnrx@sh.itjust.works](https://sh.itjust.works/u/nnrx) ! Thanks for this awesome front-end!)

Updated Voyager to 0.23.1 on m.lemmy.world
Thanks to [@aeharding@lemmy.world](https://lemmy.world/u/aeharding) for another release with awesome enhancements, see release notes here: https://lemmy.world/post/1558795

I blogged about what happened in June, including the financial overview.

I think I fixed the thumbnails issue :-)
It's always the small things you overlook... The `docker-compose.yml` I copied from somewhere when setting up lemmy.world apparently was missing the external network for the pictrs container. So pictrs was working as long as it got the images via Lemmy; getting the images via URL didn't work... Looks like it's working now. Looks a whole lot better with all the images :-) **Edit** For existing posts: edit the post, then Save (no need to change anything). This also fetches the image.
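For illustration, a minimal sketch of what the missing piece might have looked like. The service, image tag, and network names here are assumptions, not the actual file:

```yaml
# Hypothetical docker-compose.yml excerpt. With pictrs only on an
# internal-only network it has no route to the internet, so fetching
# remote images by URL fails while uploads proxied through Lemmy work.
services:
  pictrs:
    image: asonix/pictrs:0.3.1   # tag is an assumption
    networks:
      - lemmyinternal
      - lemmyexternalproxy   # the missing line: gives pictrs outbound access
networks:
  lemmyinternal:
    internal: true
  lemmyexternalproxy:
```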

Lemmy.world updated to 0.18.2
(Duplicate post :-) see https://lemmy.world/post/1375042)

Voyager (fka wefwef) now available at m.lemmy.world
We've installed Voyager and it's reachable at https://m.lemmy.world. You can browse Lemmy and log in there (even if your account isn't on lemmy.world). **PS** Thanks go out to [@stux@geddit.social](https://geddit.social/u/stux), who came up with the idea (see https://m.geddit.social).

Lemmy.world (and some others) were hacked
While I was asleep, apparently the site was hacked. Luckily, a (big) part of the lemmy.world team is in the US, and some early birds in the EU also helped mitigate this. As I am told, this was the issue: - There was a vulnerability which was exploited - Several people had their JWT cookies leaked, including at least one admin - Attackers started changing site settings and posting fake announcements etc. Our mitigations: - We removed the vulnerability - Deleted all comments and private messages that contained the exploit - Rotated the JWT secret, which invalidated all existing cookies The vulnerability will be fixed by the Lemmy devs. [Details of the vulnerability are here](https://lemmy.world/post/1293336) Many thanks to all who helped, and sorry for any inconvenience caused! **Update** While we believe the admin accounts were what they were after, it could be that other users' accounts were compromised. Your cookie could have been 'stolen' and the hacker could have had access to your account, creating posts and comments under your name, and accessing/changing your settings (which show your e-mail). For this to happen, you would have had to be using lemmy.world at that time, and to have loaded a page that had the vulnerability in it.

Lemmy.world updated to 0.18.1
We've updated Lemmy.world to Lemmy 0.18.1. For the release notes, see https://lemmy.world/post/1139237

Some system load graphs of last 24h
For those who find it interesting, enjoy!

Lemmy.world status update 2023-07-05
Another day, another update. More troubleshooting was done today. What did we do: - Yesterday evening [@phiresky@lemmy.world](https://lemmy.world/u/phiresky) did some SQL troubleshooting with some of the lemmy.world admins. After that, phiresky submitted some PRs on GitHub. - [@cetra3@lemmy.ml](https://lemmy.ml/u/cetra3) created a docker image containing 3 PRs: [Disable retry queue](https://github.com/LemmyNet/lemmy/pull/3468), [Get follower Inbox Fix](https://github.com/LemmyNet/lemmy/pull/3482), [Admin Index Fix](https://github.com/LemmyNet/lemmy/pull/3483) - We started using this image, and saw a big drop in CPU usage and disk load. - We saw thousands of errors per minute in the nginx log for old clients trying to access the websockets (which were removed in 0.18), so we added a `return 404` in the nginx conf for `/api/v3/ws`. - We updated lemmy-ui from RC7 to RC10, which fixed a lot, among which the issue with replying to DMs. - We found that the many 502 errors were caused by an issue in Lemmy/markdown-it.actix or whatever, causing nginx to temporarily mark an upstream as dead. As a workaround we can either 1) only use 1 container or 2) set ~~`proxy_next_upstream timeout;`~~ `max_fails=5` in nginx. Currently we're running with 1 lemmy container, so the 502 errors are completely gone so far, and because of the fixes in the Lemmy code everything seems to be running smoothly. If needed we could spin up a second lemmy container using the ~~`proxy_next_upstream timeout;`~~ `max_fails=5` workaround, but for now it seems to hold with 1. Thanks to [@phiresky@lemmy.world](https://lemmy.world/u/phiresky), [@cetra3@lemmy.ml](https://lemmy.ml/u/cetra3), [@stanford@discuss.as200950.com](https://discuss.as200950.com/u/stanford), [@db0@lemmy.dbzer0.com](https://lemmy.dbzer0.com/u/db0), [@jelloeater85@lemmy.world](https://lemmy.world/u/jelloeater85), [@TragicNotCute@lemmy.world](https://lemmy.world/u/TragicNotCute) for their help! And not to forget, thanks to [@nutomic@lemmy.ml](https://lemmy.ml/u/nutomic) and [@dessalines@lemmy.ml](https://lemmy.ml/u/dessalines) for their continuing hard work on Lemmy! And thank you all for your patience, we'll keep working on it! Oh, and as a bonus, an image (thanks phiresky!) of the change in bandwidth after implementing the new Lemmy docker image with the PRs. ![](https://lemmy.world/pictrs/image/166fc6d9-972d-4ff2-aa3a-b2ecbbb90cd5.png) **Edit** So as soon as the US folks wake up (hi!) we seem to need the second Lemmy container for performance. So that's now started, and I noticed the `proxy_next_upstream timeout` setting didn't work (or I didn't set it properly), so I used `max_fails=5` for each upstream, which does actually work.
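For context, the `max_fails=5` workaround lives in the nginx upstream block; a minimal sketch (container names are assumptions, though 8536 is Lemmy's usual port):

```nginx
# nginx marks an upstream dead after max_fails failures (the default is 1);
# raising it stops one flaky 502 from taking a backend out of rotation.
upstream lemmy {
    server lemmy1:8536 max_fails=5;
    server lemmy2:8536 max_fails=5;
}
```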

It’s a single server with a 32-core/64-thread AMD EPYC and 128GB RAM. At the moment we run multiple Lemmy containers, so restarting doesn’t mean an outage.


Same happened with mastodon.world in November. Family goes first, then work, and then all of my hobbies, of which this is one. (But the one taking up most time at the moment…)



I think I use all chat software there is. I’m in hundreds of Matrix rooms. But I think one of the team at least didn’t like or use Matrix. Don’t remember. And I have Discord anyway for the Mastodon channels…



We have 128GB of RAM. It just skyrockets after a while!


Yes he’s one of the other admins in our Discord, he’s very helpful!


Lemmy.world status update 2023-07-04
# Status update July 4th Just wanted to let you know where we are with Lemmy.world. ## Issues As you might have noticed, things still won't work as desired. We see several issues: ### Performance - Loading is mostly OK, but sometimes things take forever - We (and you) see many 502 errors, resulting in empty pages etc. - System load: The server is roughly at 60% CPU usage and around 25GB RAM usage. (That is, if we restart Lemmy every 30 minutes. Else memory will go to 100%) ### Bugs - Replying to a DM doesn't seem to work. When hitting reply, you get a box with the original message which you can edit and save (which does nothing) - 2FA seems to be a problem for many people. It doesn't always work as expected. ## Troubleshooting We have many people helping us, with (site) moderation, sysadmin work, troubleshooting, advice etc. There are currently 25 people in our Discord, including admins of other servers. In the sysadmin channel we are with 8 people. We do troubleshooting sessions with them, and sometimes others. One of the Lemmy devs, [@nutomic@lemmy.ml](https://lemmy.ml/u/nutomic), is also helping with current issues. So, all is not yet running as smoothly as we hoped, but with all this help we'll surely get there! Also thank you all for the donations; this helps give us the possibility to use the hardware and tools needed to keep Lemmy.world running!
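The 30-minute restart isn't described further; a crude way to schedule such a stopgap would be a cron entry like this sketch (the container name is hypothetical):

```sh
# crontab entry: restart the (hypothetical) lemmy container every 30 minutes
# as a stopgap against the memory growth described above
*/30 * * * * docker restart lemmy_lemmy_1
```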

Sorry, I hadn’t seen the message. Still interested?


Yeah, with nginx doing the load balancing.


Ohh 1MB is too small. I’ll look into that
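If the 1MB limit here is nginx's default `client_max_body_size` (an assumption; the thread doesn't say), the fix would be a one-liner in the site config:

```nginx
# nginx rejects request bodies over 1m by default (HTTP 413);
# raise the cap to allow larger image uploads. 25m is an arbitrary example.
client_max_body_size 25m;
```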



This is lemmy.world after 4 weeks:

```
58G	pictrs
34G	postgres
```


Need support?
If you need support, it’s best not to DM me here or mention me in comments. I now have 300 notifications and probably no time to read them soon. Also, I don’t do moderation, so any moderation questions I have to forward to the moderation team. ## Where to get support There’s the [!support@lemmy.world](https://lemmy.world/c/support) community, and another option is to send mail to info@lemmy.world. Mail is converted to tickets which can be picked up by admins and moderators. Thanks! Enjoy your day!


Well we now have 3 lemmy containers and I have the feeling some are faster than others…



On a physical server in a datacenter


Lemmy.world updated to 0.18.1-rc
Looks like it works. **Edit** Still seeing some performance issues. Needs more troubleshooting. **Update: Registrations re-opened** We encountered a bug where people could not log in, see https://github.com/LemmyNet/lemmy/issues/3422#issuecomment-1616112264 . As a workaround we opened registrations. ## Thanks First of all, I would like to thank the Lemmy.world team and the 2 admins of other servers [@stanford@discuss.as200950.com](https://discuss.as200950.com/u/stanford) and [@sunaurus@lemm.ee](https://lemm.ee/u/sunaurus) for their help! We did some thorough troubleshooting to get this working! ## The upgrade The upgrade itself isn't too hard. Create a backup, then change the image names in the `docker-compose.yml` and restart. But, like the first 2 tries, after a few minutes the site started getting slow until it stopped responding. Then the troubleshooting started. ## The solutions What I had noticed previously is that the lemmy container could reach around 1500% CPU usage; above that the site got slow. Which is weird, because the server has 64 threads, so 6400% should be the max. So we tried what [@sunaurus@lemm.ee](https://lemm.ee/u/sunaurus) had suggested before: we created extra lemmy containers to spread the load (and extra lemmy-ui containers), and used nginx to load-balance between them. Et voilà. That seems to work. Also, as suggested by him, we start the lemmy containers with the scheduler disabled, and have 1 extra lemmy container running with the scheduler enabled, unused for other stuff. There will be room for improvement, and probably new bugs, but we're very happy lemmy.world is now at 0.18.1-rc. This fixes a lot of bugs.
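For illustration, the "change the image names" step amounts to something like this sketch (the service layout is an assumption, though dessalines/lemmy and dessalines/lemmy-ui are the project's published images):

```yaml
# Hypothetical docker-compose.yml excerpt for the version bump
services:
  lemmy:
    image: dessalines/lemmy:0.18.1-rc.1      # exact rc tag may differ
  lemmy-ui:
    image: dessalines/lemmy-ui:0.18.1-rc.1
```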

[Done] New try at upgrading to 0.18.1 July 1st 20:00 CET
We'll give the upgrade a new try tomorrow. I've had some good input from admins of other instances, who are also going to help troubleshoot during/after the upgrade. Also, there are newer RC versions with fixed issues. Be aware that if we need to roll back again, posts posted between the upgrade and the rollback will be lost. We see a huge rise in new user signups (duh.. it's July 1st), which also stresses the server. Let's hope the improvements in 0.18.1 will also help with that.

Federation troubleshooting
So I've been troubleshooting the federation issues with some other admins: ![](https://lemmy.world/pictrs/image/4a23a8dd-4141-4672-b95c-38e0708f6079.png) (Thanks for the help) What we see is that when there are many federation workers running at the same time, they get too slow, causing them to time out and fail. I had the federation worker count set to 200000. I've now lowered that to 8192, and set the activitypub logging to debug to get queue stats. `RUST_LOG="warn,lemmy_server=warn,lemmy_api=warn,lemmy_api_common=warn,lemmy_api_crud=warn,lemmy_apub=warn,lemmy_db_schema=warn,lemmy_db_views=warn,lemmy_db_views_actor=warn,lemmy_db_views_moderator=warn,lemmy_routes=warn,lemmy_utils=warn,lemmy_websocket=warn,activitypub_federation=debug"` Also, I saw that there were many workers retrying against servers that are unreachable, so I've blocked some of these servers: ``` commallama.social,mayheminc.win,lemmy.name,lm.runnerd.net,frostbyrne.io,be-lemmy.org,lemmonade.marbledfennec.net,lemmy.sarcasticdeveloper.com,lemmy.kosapps.com,pawb.social,kbin.wageoffsite.com,lemmy.iswhereits.at,lemmy.easfrq.live,lemmy.friheter.com,lmy.rndmm.us,kbin.korgen.xyz ``` This gave good results: way fewer active workers, so fewer timeouts. (I see that above 3000 active workers, timeouts start.) (If you own one of these servers, let me know once it's back up, so I can unblock it.) Now it's after midnight so I'm going to bed. Surely more troubleshooting will follow tomorrow and in the weekend. Please let me know if you see improvements, or still have many issues.
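From memory, the worker count lives in the federation section of lemmy.hjson; a sketch of the change (key names are an assumption, not verified against this version):

```hjson
federation: {
  enabled: true
  # lowered from 200000: with too many concurrent workers each one gets
  # so slow that deliveries time out and fail
  worker_count: 8192
}
```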

[Update: Failed again] Update to 0.18.1-rc.1 tried and rolled back
We've upgraded lemmy.world to 0.18.1-rc.1 and rolled back that upgrade because of issues. (If you posted anything in those 10 minutes between upgrade and rollback, that post is gone. Sorry!) The main issue we saw is that users couldn't log in anymore. Existing sessions still worked, but new logins failed (from macOS, iOS and Android; from Linux and Windows it worked). Also, new account creation didn't work. I'll create an issue for the devs and retry once it's fixed. **Edit** Contacted the devs; they tell me to try again with lemmy-ui at version 0.18.0. Will try again, brace for some downtime! **Edit 2** So we upgraded again, and it seemed to work nicely! But then it slowed down so much it was unusable. There were many locks in the database. People reported many JSON errors. Sorry, we won't be on 0.18.1 any time soon I'm afraid..

Lemmy.world upgraded to 0.18.1-rc.1
We've upgraded the instance to 0.18.1-rc.1 (to be completed)

Jerboa app and Lemmy 0.18
The 0.18 version of Lemmy was [announced](https://join-lemmy.org/news/2023-06-23_-_Lemmy_Release_v0.18.0). This will solve many issues. **But we can't upgrade yet**: captcha relied on websockets, which are removed in 0.18, so captcha was removed along with them. Despite the devs agreeing to [my request to add captcha back](https://github.com/LemmyNet/lemmy/issues/3200), that won't happen until 0.18.1. Without captcha we would be overrun by bots. Hopefully 0.18.1 will be released soon, because another issue is that the newest version of the Jerboa app won't work with servers older than 0.18. So if you're on Lemmy.world, please (temporarily) use another app or the web version.

Added some more known issues
I added some known issues with websockets / spinning wheel to the [known issues post](https://lemmy.world/post/15786)

I wrote my first post about Lemmy.world. When June is finished, I'll also include Lemmy in the financial update on the same blog.

[Solved] Temporarily closed signups because of spam signups
So some spam signups just happened (all with e-mail addresses in the username12345678@gmail.com format). This caused bounced mail to increase, causing Mailgun to block our domain to prevent it getting blacklisted. So: - Mail temporarily doesn't work - I closed signups for now - I will ban the spam accounts - I will check how to prevent this (maybe approval required again?) Stay tuned. **Edit**: so apparently there is a captcha option, which I have now enabled. Let's see if this prevents spam. Registrations open again. **Edit 2**: Hmm, Mailgun isn't that fast at unblocking the domain. Closing signups again because validation mails aren't being sent. **Edit 3**: I convinced Mailgun to lift the block. Signups open again.

Posting slowness issue seems solved!
Thanks to a comment by [@LargeHardonCollider@lemmy.world](https://lemmy.world/u/LargeHardonCollider), I checked and saw that 'federation debugging' mode was enabled. I had enabled it when the server had just started (less than 3 weeks ago) and I had an issue with federation. I thought I had switched it off again, but apparently not. This mode causes federation to be done in the foreground, so your 'Post' or 'Comment' action will wait for that to finish... This solves the most annoying issue, and makes the site way more usable. There are many other issues, but we'll get there.
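For reference, a sketch of the setting in question as I remember it from 0.17-era lemmy.hjson (the key name is an assumption):

```hjson
federation: {
  enabled: true
  # when true, activities are sent synchronously, so 'Post' / 'Comment'
  # blocks until federation finishes; should be false in production
  debug: false
}
```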

IPv6 enabled
I enabled the IPv6 address for lemmy.world. Should work now. Next step would be enabling DNSSEC; I have to figure out how that worked again.
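A quick external check (plain dig and curl, nothing instance-specific):

```sh
# verify the AAAA record exists and the site responds over IPv6
dig +short AAAA lemmy.world
curl -6 -sI https://lemmy.world | head -n 1
```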

[Guess not…] Installed lemmy-ui 0.18.0 RC-1
I just installed release candidate 1 of the 0.18.0 lemmy-ui component. This version removes websockets and should solve many strange issues, like the glitching vote totals, sudden changes of posts, etc. Let me know if you see improvements, or new issues.

Lemmy.world About post / Rules / FAQ.
To be created # About # Rules # FAQ

Workaround for the performance issue with posting in large communities
We're still working to find a solution for the posting slowness in large communities. We have seen that a post does get submitted right away, but the page keeps 'spinning'. So right after you click 'Post' or 'Reply', you can refresh the page and the post should be there. (But to be sure, you could copy the contents of your post first, so you can paste it again if anything goes wrong.)

[Done] Server will be migrated (More power!)
So after we've extended the virtual cloud server twice, we're at the max for the current configuration. And with this crazy growth (almost 12k users!!) even now the server is more and more reaching capacity. Therefore I decided to order a dedicated server, the same one as used for mastodon.world. So the bad news... we will need some downtime. Hopefully not too much. I will prepare the new server, copy (rsync) stuff over, stop Lemmy, do a last rsync and change the DNS. If all goes well it would take maybe 10 minutes of downtime, 30 at most. (With mastodon.world it took 20 minutes, mainly because of a typo :-) ) For those who would like to donate to cover server costs, you can do so at our [OpenCollective](https://opencollective.com/mastodonworld) or [Patreon](https://patreon.com/mastodonworld). Thanks! **Update** The server was migrated. It took around 4 minutes of downtime. For those who asked: it now uses a dedicated server with an AMD EPYC 7502P 32-core "Rome" CPU and 128GB RAM. Should be enough for now. I will be tuning the database a bit, so that may give some extra seconds of downtime; just refresh and it's back. After that I'll investigate further into the cause of the slow posting. Thanks [@veroxii@lemmy.world](https://lemmy.world/u/veroxii) for assisting with that.
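As a rough shell sketch, the migration recipe described above (paths, hostnames and service layout are placeholders):

```sh
# initial copy while the site is still up (slow, but no downtime)
rsync -aHAX --info=progress2 /srv/lemmy/ newserver:/srv/lemmy/
# downtime starts: stop the stack, then sync only the delta (fast)
docker compose down
rsync -aHAX --delete /srv/lemmy/ newserver:/srv/lemmy/
# point DNS at the new server and start the stack there
```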

[Done - for now…] Expect some brief restarts today (Jun 12 CET)
I'm trying to fix this annoying slowness when posting to larger communities. (Just try replying here...) I'll be doing some restarts of the docker stack and nginx. Sorry for the inconvenience. **Edit**: Well, I've changed nginx from running in a docker container to running on the host, but that hasn't solved the posting slowness…

Lemmy.world starting guide
(I'm creating a starting guide post here. Have patience, it will take some time...) **Disclaimer**: I am new to Lemmy like most of you. Still finding my way. If you see something that isn't right, let me know. Also additions, please comment! # Welcome! Welcome to Lemmy (on whichever server you're reading this) # About Lemmy Lemmy is a federated platform for news aggregation / discussion. It's being developed by the Lemmy devs: https://github.com/LemmyNet ## About Federation What does this federation mean? It means Lemmy is using a protocol (ActivityPub) which makes it possible for all Lemmy servers to interact. - You can search and view communities on remote servers from here - You can create posts in remote communities - You can respond to remote posts - You will be notified (if you wish) of comments on your remote posts - You can follow Lemmy users/communities on other platforms that also use ActivityPub (like Mastodon, Calckey etc.) (There's currently a known issue with that, see [here](https://lemmy.world/post/15786).) Please note that a server only starts indexing a remote server/community once a user of this server has interacted with it. A great image describing this, made by [@ulu_mulu@lemmy.world](https://lemmy.world/u/ulu_mulu): https://imgur.com/a/uyoYySY ![](https://lemmy.world/pictrs/image/0006c2db-13c6-406e-97e7-6e274fddf355.png) # About Lemmy.world Lemmy.world is one of the many servers hosting the Lemmy software. It was started on June 1st, 2023 by [@ruud@lemmy.world](https://lemmy.world/u/ruud), who is also running https://mastodon.world, https://calckey.world and others. A list of Lemmy servers and their statistics can be found at [FediDB](https://fedidb.org/software/lemmy) # Quick start guide ## Account You can use the account you created to log in to the server on which you created it, not on other servers. Content is federated to other servers; users/accounts are **not**. ## Searching In the top menu, you'll see the search icon. There, you can search for posts, communities etc. ![](https://lemmy.world/pictrs/image/1cd03dea-443b-4a92-ba87-5b45561200fd.png) You can just enter a search word and it will find the post titles, post content, communities etc. containing that word **that the server knows of**. So: any content any user of this server ever interacted with. You can also search for a community by its link, e.g. `!Netherlands@lemmy.nl`. Even if the server hasn't ever seen that community, it will look it up remotely. Sometimes it takes some time to fetch the info (and it displays 'No results' meanwhile..) so just be patient and search a second time after a few seconds. ## Creating communities First, make sure the community doesn't already exist. Use search (see above). Also try [https://browse.feddit.de/](https://browse.feddit.de/) to see if there are remote communities on other Lemmy instances that aren't known to Lemmy.world yet. If you're sure it doesn't exist yet, go to the homepage and click 'Create a Community'. ![](https://lemmy.world/pictrs/image/d49e3218-fcee-4dc6-8879-7b5a4986da4d.png) It will open up the following page: ![](https://lemmy.world/pictrs/image/b03c9fb1-69ba-43b5-985f-97c3820e146a.png) Here you can fill out: - Name: should be all lowercase letters. This will be the /c/ name. - Display name: As to be expected, this will be the displayed name. - You can upload an icon and banner image. Looks pretty. - The sidebar should contain things like description, rules, links etc. You can use Markdown (yey!) 
- If the community will contain mainly NSFW content, check the NSFW mark. NSFW is allowed as long as it doesn't break [the rules](https://mastodon.world/about). - If you only want moderators to be able to post, check that checkbox. - Select any language you want people to be able to post in. Apparently you shouldn't de-select 'Undetermined': I was told some apps use 'Undetermined' as the default language, so they don't work if you don't have it selected. ## Reading I think reading is obvious: just click the post and you can read it. Sometimes when there are many comments, they will be partly collapsed. ## Posting When viewing a community, you can create a new post in it. First of all, make sure to check the community's rules, probably stated in the sidebar. ![](https://lemmy.world/pictrs/image/bf81a5f5-997d-42e0-8544-5051cf9657d7.png) In the Create Post page these are the fields: - URL: Here you can paste a link, which will be shown at the top of the post. The thumbnail of the post will also link there. **Alternatively** you can upload an image using the image icon to the right of the field. That image will also be displayed as the thumbnail for the post. - Title: The title of the post. - Body: Here you can type your post. You can use Markdown if you want. - Community: Select the community where you want this post created; defaults to the community you were in when you clicked 'Create Post'. - NSFW: Select this if you post any NSFW material; this blurs the thumbnail and displays 'NSFW' behind the post title. - Language: Specify in which language your post is. Also see the [Lemmy documentation](https://join-lemmy.org/docs/en/users/02-media.html) on formatting etc. ## Commenting ## Moderating / Reporting ## Client apps There are some apps available or in testing. See [this post](https://lemmy.world/post/465785) for a list! # Issues When you find any issue, please report it here: https://lemmy.world/post/15786 if you think it's server related (or you're not sure). Report any issues or improvement requests for the Lemmy software itself here: https://github.com/LemmyNet ## Known issues Known issues can be found in the aforementioned post; one of the most annoying ones is the fact that posting/replying in a somewhat larger community can take up to 10 seconds. It seems to be related to the number of subscribers of the community. I'll be looking into that one, and hope the devs are too.

Who wants to moderate SelfHosted community?
lock
Looking for help with moderation; I have my hands full administering this server ;-) Requirements: - Need to have read and agreed to the rules (https://mastodon.world/about) - Need a little bit of time to keep an eye here

1000 users!
lemmy.world just reached 1000 users. Please remember that the server was created June 1st! So you might still notice some startup issues... but so far so good! Welcome @all!

Lemmy.world improvements and issues
In this post I will list the known issues and possible improvements for Lemmy.world. Please comment with any issue or area for improvement you see and I will add it here. Remember: this instance was only started June 1st, so there's a lot of troubleshooting and tweaking to be done. Issues can be: - Local (lemmy.world) (also performance issues) - Lemmy software issues - Other software related (apps/Fediverse platforms etc) - Remote server related - (User error? ...) ## Known issues ### Websockets issues There are some issues with the Websockets implementation used in Lemmy, which handles the streaming. Websockets will be removed in version 0.18, so let's hope these issues will all be gone then! - Top posts page gets a stream of new posts > Websockets issue - You're suddenly in another post than you were before > Websockets issue - Your profile will briefly display another name/avatar in the top right corner ### Spinning wheel issues Error handling is not one of Lemmy's strong points. Sometimes something goes wrong, but instead of getting an error, the button will show a 'spinning wheel' that lasts until eternity. These are some of the known cases: - You want to create an account but the username is already taken - You want to create an account but the username is too long (>20 characters) - You want to create an account but the password is too long - You want to create a community but the name is already taken - You want to create a community but the name is not in all lowercase letters - You want to create a post over 2000 characters - You want to post something in a language that isn't allowed in the community ## Other issues - Federation not always working: apparently not everything gets synced all the time. This needs troubleshooting. - “404: FetchError: invalid json response body at http://lemmy:8536/api/v3/site” This sometimes happens when the Lemmy app container is very busy. Needs troubleshooting. ## Enhancement requests - Can themes be added? > To be checked if this can be done without changing code. For support with issues at Lemmy.world, go to [the Lemmy.world Support community](https://lemmy.world/c/support).

Woo-hoo! 100 users!!
I see we've just reached 100 users!! In 5 days..

Some federation issues [solved]
I still see some federation issues: - It sometimes takes a few tries before a remote post or community is found - Remote replies don't show up - Subscriptions to remote communities are stuck in 'pending' I'll look into that.