
Artificial intelligence can now rewrite outdated Wikipedia text

Wikipedia has become an almost essential reference point for knowledge. However, some sections may be inaccurate or may not be updated promptly to reflect current events. And, as frequent users know, this happens more often on Italian pages than on English ones.

Researchers at MIT set out to tackle the problem of keeping Wikipedia's information fresh. They developed an artificial-intelligence system that automatically identifies the parts that need updating and modifies them, so that a reader consulting Wikipedia cannot tell the difference between the parts written by the AI and the parts written by humans.

The machine-learning system consists of two modules and was trained on pairs of sentences that may restate the same concept, contradict each other, or be neutral, in the sense that they do not convey enough information to reach a conclusion. If the algorithm detects a contradiction between the two sentences, it identifies which words to remove and which to keep. This process, called "neutrality masking", removes the minimum number of words needed to "maximize neutrality".
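The masking step can be illustrated with a toy sketch. The actual MIT system learns the mask with a neural model; the word-level set comparison below is a hypothetical stand-in that only shows the idea of blanking out the minimal set of contradicting words:

```python
def neutrality_mask(outdated, updated):
    """Toy 'neutrality mask': blank out words in the outdated sentence
    that are not supported by the updated claim.

    The real system learns which words to mask; here we approximate it
    with a simple word-overlap check, masking the minimum of words that
    differ from the updated sentence.
    """
    updated_words = {w.lower() for w in updated.split()}
    return [
        w if w.lower() in updated_words else "[MASK]"
        for w in outdated.split()
    ]

masked = neutrality_mask(
    "The tower is 300 meters tall",
    "The tower is 330 meters tall",
)
print(" ".join(masked))  # The tower is [MASK] meters tall
```

Only the contradicting token is masked; the rest of the sentence is kept as the neutral skeleton that the second module will fill in.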

The second module is an encoder-decoder framework that determines how to rewrite the Wikipedia sentence, using simplified representations of the two sentences: the one Wikipedia considers obsolete and the one containing the updated facts. This step is called "fusion" because it operates on the words that differ between the two sentences, inserting the new words at the points where the contradictory words were deleted.
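The fusion step can likewise be sketched in toy form. In the real system this is a learned encoder-decoder; the function below is only an assumed, rule-based illustration of filling the masked slots with the new words from the updated claim:

```python
def fuse(masked, updated):
    """Toy 'fusion': fill each [MASK] slot in the masked skeleton with
    the words from the updated claim that the skeleton does not already
    contain (hypothetical logic; the real module is a trained
    encoder-decoder, not a rule).
    """
    skeleton = {w.lower() for w in masked if w != "[MASK]"}
    # Words unique to the updated claim become the replacement material.
    new_words = [w for w in updated.split() if w.lower() not in skeleton]
    result = []
    for w in masked:
        if w == "[MASK]" and new_words:
            result.append(new_words.pop(0))
        else:
            result.append(w)
    return " ".join(result)

print(fuse(["The", "tower", "is", "[MASK]", "meters", "tall"],
           "The tower is 330 meters tall"))
# The tower is 330 meters tall
```

Chaining the two sketches together, the outdated "300" is first masked and then replaced by "330", leaving the rest of the human-written sentence untouched.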

The system developed by MIT could also be used in fake-news detectors, potentially reducing the bias inherent in certain news. For technical details, see the MIT website.

MIT submitted the system to human judgment and obtained a rating of 4 out of 5 for the accuracy of its updates and 3.85 out of 5 for grammar. These scores are higher than those of other automatic text-generation systems, but still short of perfection.

Technologies like this can be broadly useful, and not only for keeping Wikipedia up to date. In theory, they could replace, or assist, human editors, guaranteeing an impartiality and precision that cannot always be taken for granted in a human being.
