Wikipedia, the free multilingual online encyclopedia, is getting a new AI tool that can check the information in thousands of citations at once.
Why Wikipedia Needs Reliable Citations
Wikipedia contains more than 4 million citations. For users to trust an article, they need to see links to sources that support its claims. Suppose a Wikipedia article says that President Obama went to Europe and then to Kenya, where he met relatives on his father's side of the family for the first time. Citations and links are important so that readers can check that this information comes from reliable sources.
Linking to outside resources gives articles useful context, but it can't replace explanations in the text itself. When readers click a link, they often end up somewhere that has little to do with the claim it was meant to support, and they may give up on the topic or drift off to something else.
Meta Begins Work on an AI Tool
A Wikipedia article tells the story of Joe Hipp, the first Native American to challenge for the WBA (World Boxing Association) heavyweight title. The source the article cited, however, said nothing about Joe Hipp or boxing at all.
The Joe Hipp example shows how unreliable sources on Wikipedia can lead readers astray and let false information spread around the globe. Because of this, Meta AI (the research lab of the social media giant Facebook) has begun working with the Wikimedia Foundation. The team says its model is the first machine learning system that can check thousands of citations at the same time.
Checking each citation by hand would take far too long, so automating the process saves an enormous amount of time.
Work on the Meta AI Tool
Fabio Petroni, the research tech lead manager at Meta AI, told Digital Trends the following:
I think that curiosity was what drove us in the end. We set out to see how far this technology could go. We didn't even know whether AI could help in this situation. As far as we know, this is a first.
He went on to explain how the tool works:
We used these models to build an index of all of these web pages, breaking them up into passages and computing a representation for each passage that captures not the text word-for-word but its meaning. That keeps chunks of text with similar meanings close together in the n-dimensional space where all the passages are stored.
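The passage-and-index idea Petroni describes can be sketched in a few lines. This is a toy illustration only: it uses bag-of-words vectors and cosine similarity in place of the learned neural embeddings Meta actually uses, and the passages below are invented for the example.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words frequency vector.
    (The real system uses learned neural representations of meaning.)"""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Index: split source pages into passages and store a vector for each.
passages = [
    "Joe Hipp was the first Native American to challenge for the WBA heavyweight title",
    "The Blackfeet Nation is located in Montana",
    "Wikipedia is a free multilingual online encyclopedia",
]
index = [(p, embed(p)) for p in passages]

def nearest(claim):
    """Return the indexed passage whose vector lies closest to the claim."""
    q = embed(claim)
    return max(index, key=lambda item: cosine(q, item[1]))[0]
```

Because similar passages sit close together in this vector space, a claim such as "first Native American WBA heavyweight challenger" retrieves the Joe Hipp passage rather than an unrelated one, which is the basic mechanism behind matching citations to supporting sources.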
Petroni says the tool is still at an early stage:
Our prototype proves that the idea works, but there isn't much you can do with it at the moment. For this to be useful, the index needs to be updated with more information, and since new information comes out every day, it must always be changing.
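The "always changing" index Petroni mentions could, in the simplest case, be a store that folds in newly crawled passages each day instead of being rebuilt from scratch. This is a hypothetical sketch, not Meta's actual pipeline; the function and data are invented for illustration.

```python
from datetime import date

# Hypothetical incremental index: passage text -> date it was indexed.
index = {}

def add_passages(passages, day):
    """Fold newly crawled passages into the index, skipping duplicates.
    Returns how many passages were actually new."""
    added = 0
    for p in passages:
        if p not in index:
            index[p] = day
            added += 1
    return added

# Each day's crawl only contributes what wasn't already indexed.
add_passages(["passage A", "passage B"], date(2022, 7, 11))
add_passages(["passage B", "passage C"], date(2022, 7, 12))  # only C is new
```

A real deployment would also have to re-embed changed pages and drop passages from dead links, but the core requirement is the same: the index grows and shifts daily rather than staying frozen.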
The AI tool can process not only text but also audio and video, which should make it useful on sites where photos and videos are shared.