News & Events

Menczer Creates Web Search Model

Internet search engines routinely use both the text of pages and the links between pages to return relevant search results. The approach works reasonably well, but less is known about why text and links are related in the first place. A researcher from the University of Iowa has expanded the utility of using text and links in search engines with a mathematical model that divides a large network like the Internet into small local Webs. A Web crawler designed to completely traverse such a small Web will provide more comprehensive coverage of a topic than typical search engines, according to FILIPPO MENCZER, an assistant professor of management sciences at the University of Iowa. "My result shows that it is possible to design efficient Web crawling algorithms -- crawlers that can quickly locate any related page among the billions of unrelated pages in the Web," he said.
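The kind of topic-focused crawler Menczer describes can be sketched as a best-first search over the link graph, always expanding the frontier page that looks most relevant to the topic. The toy in-memory "Web", the word-overlap scoring function, and all names below are illustrative assumptions for this sketch, not Menczer's actual model; a real crawler would fetch pages over HTTP and use a far better relevance measure.

```python
import heapq

# Hypothetical in-memory "Web": page id -> (page text, outgoing links).
# Stands in for real HTTP fetches in this sketch.
PAGES = {
    "a": ("search engines rank pages by text and links", ["b", "c"]),
    "b": ("focused crawlers follow topically relevant links", ["d"]),
    "c": ("recipes for chocolate cake and brownies", ["e"]),
    "d": ("topical locality means related pages cluster in local webs", []),
    "e": ("gardening tips for spring", []),
}

def score(text, topic_words):
    """Crude relevance proxy: fraction of topic words present in the text."""
    words = set(text.split())
    return sum(w in words for w in topic_words) / len(topic_words)

def focused_crawl(seed, topic_words, max_pages=10):
    """Best-first crawl: repeatedly visit the most topic-relevant frontier page."""
    frontier = [(-1.0, seed)]  # max-heap via negated scores
    visited, results = set(), []
    while frontier and len(results) < max_pages:
        _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        text, links = PAGES[url]
        results.append((url, score(text, topic_words)))
        for link in links:
            if link not in visited:
                link_text, _ = PAGES[link]
                heapq.heappush(frontier, (-score(link_text, topic_words), link))
    return results
```

Starting from seed `"a"` with topic words `["crawlers", "links", "pages"]`, the crawler visits the on-topic pages `a`, `b`, and `d` before the unrelated pages `c` and `e`, illustrating how link-guided search can reach related pages quickly without exhaustively crawling everything.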

