Wikipedia

Hello Fediverse! 👋

#Wikipedia :wikipedia: is a multilingual free online encyclopedia written and maintained by a community of volunteers, known as Wikipedians, through open collaboration and using a wiki-based editing system called MediaWiki – or so says our Wikipedia article (w.wiki/W).

Follow for cool facts, behind-the-scenes details on editing processes, and tips on getting started with contributing!

#Introduction #FreeCulture #FreeContent #OpenKnowledge #wiki #CreativeCommons

Wikipedia

There are currently 332 language editions of #Wikipedia (again, per Wikipedia). We'll do our best to bring multilingual content, but we would appreciate help from Wikipedians involved in non-English editions in suggesting content to post or translating existing posts.

This account is community-run by Wikipedians (like you!) - see meta.wikimedia.org/wiki/@Wikip for details.

Please help us create a Mastodon account that anyone can edit :-)

Thaiis Thei 𓁟

@wikipedia Hey Wikipedia, when are you going to do something about the flagrant and blatant abuse of your rules by groups such as the Guerilla Skeptics?

Nate M

@Oozenet @wikipedia You could do something. After all, anyone can edit.

Thaiis Thei 𓁟

@natemtn @wikipedia What a trite reply. I have been an editor for over 16 years. I have fought many fights on Wikipedia to fix problems but there are too many paid editors and trolls with agendas now. They break the rules and support each other in doing so.

Sebastian Lasse

@wikipedia

We need to meet about “federating wikidata” and thus wikidata.
It is easy, and @maxlath has already done many things in this field.
The ActivityPub protocol which we are writing here is multilingual too:
"nameMap" = label
"summaryMap" = description
"contentMap" = wikipedia
Any claim or qualified statement from wikidata can be a "Relationship" (with e.g. startTime, endTime, etc.) –
all just like in wikibase.

We can really do better than Mastodon.
You know what languages your user speaks. Even without settings, navigator is your friend.
I wrote github.com/redaktor/languages (852 languages); for the browser you send the user the language fingerprints they need, and your system always knows what language they are typing in.
Any claim can federate, any top instance is a `type`, etc.
The thing I am writing for the fediverse will benefit from wikidata's multilanguage support, and from wikidata itself, because any item can become a “Topic” [like a hashtag as a Service Actor which you can subscribe to].
Currently I am building a Service Actor for federated geocoding.
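
A minimal sketch of how that mapping could look on the wire, assuming Q52 (Wikipedia's Wikidata item); the claim's target QID is left elided, and everything here is illustrative rather than an agreed federation format:

```typescript
// A sketch only: an ActivityStreams 2.0 object carrying Wikidata's
// label/description and the Wikipedia article text in the standard AS2
// language maps, per the mapping above. Text values are truncated samples.
const item = {
  '@context': 'https://www.w3.org/ns/activitystreams',
  type: 'Object',
  id: 'https://www.wikidata.org/entity/Q52',
  nameMap: {    // = wikidata label
    en: 'Wikipedia',
    fr: 'Wikipédia',
  },
  summaryMap: { // = wikidata description
    en: 'free multilingual online encyclopedia',
  },
  contentMap: { // = wikipedia article text
    en: 'Wikipedia is a multilingual free online encyclopedia…',
  },
};

// A qualified statement as an AS2 "Relationship"; subject, relationship and
// object are standard AS2 properties, startTime a standard qualifier.
// P31 is wikidata's "instance of"; the target QID is elided here.
const claim = {
  '@context': 'https://www.w3.org/ns/activitystreams',
  type: 'Relationship',
  subject: 'https://www.wikidata.org/entity/Q52',
  relationship: 'http://www.wikidata.org/prop/direct/P31',
  object: 'https://www.wikidata.org/entity/Q…',
  startTime: '2001-01-15T00:00:00Z',
};
```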


maxlath

@sl007 nice language detection lib! I was trying to get somewhere in that direction with just unicode github.com/maxlath/unicode-scr, but this looks much more powerful! What is the source for the fingerprints?
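
As a point of reference, the unicode-script idea can be sketched with Unicode property escapes, which JavaScript regexes have supported since ES2018; this is an illustration, not code from either repo, and the scripts list is a small sample:

```typescript
// Narrow the candidate languages by testing which Unicode scripts
// actually occur in the text.
const SCRIPTS = ['Latin', 'Cyrillic', 'Greek', 'Arabic', 'Han', 'Devanagari'];

function detectScripts(text: string): string[] {
  return SCRIPTS.filter((script) =>
    new RegExp(`\\p{Script=${script}}`, 'u').test(text),
  );
}

detectScripts('Привет, мир');        // ['Cyrillic']
detectScripts('mixed: कखग and abc'); // ['Latin', 'Devanagari']
```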

Sebastian Lasse

@maxlath

Most powerful would probably be: both :)
Also, we need to mention that of the 850 languages, only about 400 are really “spoken” languages.
It was driven by the desire to give a voice to minor languages, and so, after the reportages in Papua New Guinea, the source was tinkering: as far as I remember, about 400 were trained from wikibase and about 100 from a mix of specific dictionaries and web sources.
We could derive the “language variants” via BCP-47, blowing that up to maybe 700.
The rest (about 150, mostly minor languages) was driven by the desire to find anything you can in this English-dominated internet, even papers about the language.

Anyway: my belief is that any fedi onboarding [just in: smashingmagazine.com/2023/04/d]
should include the user saying:
“This is my native language (100%), and I speak these others 10%–99% perfectly …”
Then it will get precise, because it limits detection to a selection where we should have enough difference.
🧵 1/2
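
A toy sketch of that onboarding idea; `declared`, `scoreAgainstFingerprint` and the numbers are all made up for illustration, the point being that detection only has to rank within the user's declared set:

```typescript
interface DeclaredLanguage {
  tag: string;         // BCP-47 tag, e.g. 'de' or 'pt-BR'
  proficiency: number; // 1.0 = native, 0.1–0.99 = partial
}

// What the user declared at onboarding.
const declared: DeclaredLanguage[] = [
  { tag: 'de', proficiency: 1.0 },
  { tag: 'en', proficiency: 0.9 },
  { tag: 'fr', proficiency: 0.4 },
];

// Dummy stand-in for a real fingerprint score in [0, 1]
// (e.g. n-gram similarity against a trained language fingerprint).
function scoreAgainstFingerprint(_text: string, tag: string): number {
  return tag === 'en' ? 0.8 : 0.3;
}

function detectLanguage(text: string): string {
  // Ranking only within the declared set keeps the candidates far apart,
  // which is what makes the detection precise.
  return declared
    .map((l) => ({
      tag: l.tag,
      score: scoreAgainstFingerprint(text, l.tag) * l.proficiency,
    }))
    .sort((a, b) => b.score - a.score)[0].tag;
}

detectLanguage('The quick brown fox'); // 'en'
```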


Sebastian Lasse

@maxlath

Otherwise the candidate languages are limited by the script used (I think it didn't cover mixed scripts; something to think about).

Looking into yours now.
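
One conceivable way to close that mixed-script gap (an illustration, not something either library does today): split the text into per-script runs, then detect the language of each run separately:

```typescript
// Split text into runs of a single script; a detector can then run per run.
// The scripts list is a small sample for illustration.
function splitByScript(text: string): { script: string; run: string }[] {
  const sample = ['Latin', 'Cyrillic', 'Han', 'Arabic', 'Devanagari'];
  const runs: { script: string; run: string }[] = [];
  for (const ch of text) {
    const script =
      sample.find((s) => new RegExp(`\\p{Script=${s}}`, 'u').test(ch)) ??
      'Common'; // spaces, punctuation, digits
    const last = runs[runs.length - 1];
    if (last && last.script === script) last.run += ch;
    else runs.push({ script, run: ch });
  }
  return runs;
}

splitByScript('hello мир');
// → [{ script: 'Latin', run: 'hello' },
//    { script: 'Common', run: ' ' },
//    { script: 'Cyrillic', run: 'мир' }]
```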
