The networked Internet connects the world.
It makes it possible to gather and analyze information with enormous speed. But when we put something on the net, it is hard to remove. That is both the blessing and the curse of this international network. The Internet transmits only information, and information is a virtual tool.
The network can copy data onto billions of computers. The information that robots and other systems receive is the tool they work with. Incoming information acts as a trigger, and that trigger launches a reaction that depends on what is stored in the system's memory blocks.
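As a rough illustration of that trigger-and-reaction idea (not any particular robot's software), the stored "memory" can be sketched as a lookup table: an incoming message is matched against stored responses. The triggers and actions below are invented examples.

```python
# Hypothetical sketch: incoming information acts as a trigger,
# and the reaction is looked up from the system's stored "memory".

# Stored reactions; the keys and actions are invented examples.
memory_blocks = {
    "obstacle_detected": "stop_motors",
    "battery_low": "return_to_dock",
    "temperature_high": "start_cooling",
}

def react(trigger: str) -> str:
    """Return the reaction stored for this trigger, or a safe default."""
    return memory_blocks.get(trigger, "ignore")

print(react("battery_low"))     # -> return_to_dock
print(react("unknown_signal"))  # -> ignore
```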
Information is the tool that controls the machines and other devices we say are "connected to the Internet". In a sense, nothing is connected to the net itself: the net is only the platform that brings physical computers under one banner. It is the thing that interconnects servers, and that is what makes the cloud possible.
The cloud means a virtual supercomputer that behaves much like one monolithic supercomputer. The difference between a monolithic structure and a cloud-based multi-node architecture is that there is, in principle, no limit to the size of the cloud. Theoretically, we can connect an unlimited number of workstations to a cloud-based computer or server.
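A minimal sketch of that scaling-out idea, using only Python's standard library: the same job is split across a pool of workers, the way a cloud spreads work across many machines. The task and the worker count are invented for illustration.

```python
# Hypothetical sketch: one job split across many workers,
# the way a cloud spreads work across many machines.
from concurrent.futures import ProcessPoolExecutor

def work(chunk: range) -> int:
    """Invented stand-in task: sum a chunk of numbers."""
    return sum(chunk)

if __name__ == "__main__":
    chunks = [range(i, i + 1_000_000) for i in range(0, 4_000_000, 1_000_000)]
    # Adding more workers (or more machines) is, in principle,
    # just a bigger pool.
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(work, chunks))
    print(total)
```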
The cloud can be a complicated structure that involves billions of subnetworks. When an outside or upper-level server sees a single IP address, there can be a whole group of networks behind that address. The user can still use that kind of system as if it were one solid, monolithic computer.
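One familiar mechanism behind this is network address translation, where one public address fronts many private hosts. A toy sketch of such a mapping; the addresses are invented documentation-range examples, not real ones.

```python
# Hypothetical sketch of one public IP address fronting a group of
# internal networks, roughly the way NAT hides them from the outside.
PUBLIC_IP = "203.0.113.10"  # documentation-range address, invented

# Port on the public address -> internal host that actually answers.
nat_table = {
    8080: "10.0.0.5",
    8443: "10.1.2.7",
    9000: "192.168.3.12",
}

def route(public_port: int) -> str:
    """Return the internal host behind the public address for this port."""
    host = nat_table.get(public_port)
    return f"{PUBLIC_IP}:{public_port} -> {host}" if host else "no mapping"

print(route(8443))  # -> 203.0.113.10:8443 -> 10.1.2.7
```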
The cloud itself doesn't know what kind of information is stored in it. Certain parameters let the cloud detect malicious software. Those parameters can sit in a firewall, and they detect harmful code in the data that travels through the network.
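In the simplest case those parameters are signatures: byte or text patterns that a firewall or scanner compares against passing data. A minimal sketch; the patterns and payloads are invented examples.

```python
# Hypothetical sketch: flag traffic that matches known "parameters"
# (signatures). The patterns below are invented examples.
SIGNATURES = [b"DROP TABLE", b"<script>evil", b"\x4d\x5a\x90\x00"]

def looks_malicious(payload: bytes) -> bool:
    """Return True if any known signature appears in the payload."""
    return any(sig in payload for sig in SIGNATURES)

print(looks_malicious(b"GET /index.html"))           # False
print(looks_malicious(b"q='; DROP TABLE users;--"))  # True
```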
AI-based chatbots use cloud architecture to find answers. Those chatbots are problematic because they don't understand what they say. They simply collect information from multiple sources, interconnect that data into an answer, and in that process mainly check the grammar before the answer is sent to the user.
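That description boils down to retrieve-and-stitch: pull snippets from several sources, join them, and tidy the surface text. A deliberately naive sketch of that pipeline; the sources and the "grammar check" are invented placeholders, not how any real chatbot works.

```python
# Deliberately naive sketch of "collect, interconnect, check grammar".
# The snippets and the cleanup step are invented placeholders.
sources = {
    "site_a": "the cloud is a group of interconnected servers",
    "site_b": "it behaves like one big virtual computer",
    "site_c": "its size has no fixed upper limit",
}

def answer(question: str) -> str:
    # The question itself is never "understood" in this toy; it is ignored.
    # 1. "Collect" information from multiple sources.
    snippets = list(sources.values())
    # 2. "Interconnect" the data into one answer.
    text = ", and ".join(snippets)
    # 3. "Check the grammar": here just capitalization and a full stop.
    return text[0].upper() + text[1:] + "."

print(answer("What is the cloud?"))
```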
Sometimes network marketers make a redirect page that contains only one word. If the search engine indexes homepages using the number of clicks, those redirect pages can increase their PageRank.
The problem with the Internet and search engine indexes is that new homepages that are not yet indexed, or have not yet collected clicks, are hard to find. Before a certain number of clicks, a new homepage does not get its place in the sun. People do not have the time or the willingness to search through even thousands of homepages.
Those redirect pages use very common search words. That increases their position in the search index. And if a redirect homepage contains a command that automatically relinks the user to a certain homepage, it can raise the PageRank of that homepage. That can make the AI give wrong answers.
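The "command that relinks the user" is typically a meta-refresh tag or a script. A small sketch that checks a page's HTML for such an automatic redirect; the page content is invented.

```python
# Hypothetical sketch: detect an automatic redirect in a page's HTML,
# e.g. a meta-refresh tag pointing somewhere else. The HTML is invented.
import re

page_html = """
<html><head>
<meta http-equiv="refresh" content="0; url=https://example.com/target">
</head><body>cars cars cars cars</body></html>
"""

def has_auto_redirect(html: str) -> bool:
    """Return True if the page contains a meta-refresh redirect."""
    pattern = r'<meta[^>]+http-equiv=["\']refresh["\'][^>]*>'
    return re.search(pattern, html, flags=re.IGNORECASE) is not None

print(has_auto_redirect(page_html))  # True
```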
When the algorithms select homepages, they could simply calculate what percentage of the words in the search query appear on the homepages that the AI uses for generating answers.
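That overlap could be computed as a simple ratio, for example:

```python
# Hypothetical sketch: what share of the query's words appear on a page.
def query_coverage(query: str, page_text: str) -> float:
    """Fraction of distinct query words that occur in the page text."""
    query_words = set(query.lower().split())
    page_words = set(page_text.lower().split())
    if not query_words:
        return 0.0
    return len(query_words & page_words) / len(query_words)

print(query_coverage("best electric cars 2024",
                     "We review the best electric cars of the year"))
# -> 0.75 (3 of the 4 query words appear on the page)
```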
The AI might have a list of words, like articles and common verbs, that help it detect the core elements of a query. If some homepage consists 100 % of one certain repeating word, that should make the AI suspect that the page's purpose is to raise its own or some other homepage's PageRank. If the AI uses PageRank to choose the sources it reads, that kind of redirect page can be catastrophic for the answers the AI gives.
That makes it possible for an algorithm to detect whether a homepage is a hoax. The easiest way is simply to check whether there is only one repeating word on the whole homepage, as in the sketch below. That is the easiest way to catch the crudest hoax homepages.
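A minimal sketch of that check, after dropping articles and other common filler words; the stop-word list, the threshold, and the example pages are invented.

```python
# Hypothetical sketch: flag a page as a likely hoax if, after removing
# common filler words, essentially only one word keeps repeating.
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "are", "and", "or", "to", "of"}

def looks_like_hoax(page_text: str) -> bool:
    """True if one single word makes up (almost) all of the real content."""
    words = [w for w in page_text.lower().split() if w not in STOP_WORDS]
    if not words:
        return False
    most_common_count = Counter(words).most_common(1)[0][1]
    return most_common_count / len(words) >= 0.9  # invented threshold

print(looks_like_hoax("cars cars cars cars cars cars"))             # True
print(looks_like_hoax("a review of the best electric cars today"))  # False
```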