Two professors from Wellesley College’s Department of Computer Science have been awarded a nearly half-million-dollar National Science Foundation grant to build an application that gauges the trustworthiness of information shared on social networks, and in particular Twitter.
"Users leave a digital trace behind when they make an announcement," says Eni Mustafaraj, a visiting assistant professor who, along with Computer Science Professor Panagiotis Metaxas, earned attention last year for research into Twitter bombing used to influence voters during the 2010 Massachusetts special congressional election (their work has since been built upon in other states, such as Indiana). "The application will follow those digital traces to determine whether a message sender is reputable, allowing the user to make a determination about whether a message should be trusted."
Determining the trustworthiness of information sources on social networks is becoming more important as more people rely on such information to make financial, medical and other decisions. The researchers initially planned to focus on spammer identification, but have broadened their effort to help social network users determine whether the information they are looking at should be trusted.
Factors going into the trust measurement include the originating sender's history, whether other Twitter users trust that sender, and whether the same information is mysteriously surfacing from seemingly separate sources.
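The article doesn't describe the researchers' actual model, but the three factors could be combined into a single score along these lines. Everything here is illustrative: the `Sender` fields, the weights, and the formula are assumptions, not Mustafaraj and Metaxas' method.

```python
from dataclasses import dataclass

@dataclass
class Sender:
    """Hypothetical summary of a message sender's digital trace."""
    accurate_past_messages: int   # prior messages later verified as accurate
    total_past_messages: int      # all prior messages observed
    endorsements: int             # other users who vouch for this sender
    followers: int                # audience size, used to scale endorsements

def trust_score(sender: Sender, independent_sources: int) -> float:
    """Blend the three factors from the article into a 0..1 score.

    Weights are arbitrary placeholders chosen for the sketch.
    """
    history = sender.accurate_past_messages / max(sender.total_past_messages, 1)
    endorsement = sender.endorsements / max(sender.followers, 1)
    # A claim echoed by several unrelated sources is harder to fabricate;
    # saturate the bonus so the first extra source matters most.
    corroboration = 1 - (0.5 ** independent_sources)
    return 0.5 * history + 0.2 * min(endorsement, 1.0) + 0.3 * corroboration
```

A sender with a mostly accurate record whose claim is echoed by a few independent accounts would score high; a brand-new account pushing a claim no one else reports would score near zero.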
The NSF grant will also fund development of an online course to teach undergrads and high school students to think critically about information sources.
TWEETOGRAPHER IS BORN
Separately, a pair of University of Cincinnati computer science students will have to wait for their Twitter payday, but they’ve gotten a good start by creating a Web-based app called Tweetographer that helps users mine Twitter for useful data about what’s going on in their area.
Billy Clifton and Alex Padgett’s "Tweetographer," their six-month senior project, is described as a real-time events guide extracted from information coming via large numbers of tweets. Their work included coming up with a queuing system to process the flood of tweets and deciphering shorthand used by tweeters for days of the week, locations and such.
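The two pieces described above, a queue to absorb the flood of tweets and a normalizer for tweeters' shorthand, could be sketched as below. The shorthand table and function names are hypothetical; Clifton and Padgett's actual mappings and queuing design are not published.

```python
import queue
import re
import threading

# Hypothetical shorthand table, standing in for Tweetographer's real one.
SHORTHAND = {
    "tmrw": "tomorrow", "tonite": "tonight",
    "fri": "friday", "dt": "downtown",
}

def normalize(tweet: str) -> str:
    """Expand common tweet shorthand so later stages see consistent tokens."""
    tokens = re.findall(r"\w+|\W+", tweet.lower())
    return "".join(SHORTHAND.get(t, t) for t in tokens)

def worker(inbox: "queue.Queue[str | None]", results: list) -> None:
    """Drain the queue, normalizing each tweet; None is the shutdown signal."""
    while True:
        tweet = inbox.get()
        if tweet is None:
            break
        results.append(normalize(tweet))
        inbox.task_done()
```

A producer thread would feed raw tweets into `inbox` while one or more worker threads drain it, which is the usual way to keep ingestion from blocking on processing when tweet volume spikes.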
"We wanted to explore data mining, which is an important area of research in Computer Science, in the context of social media," Padgett said, in a statement. "Although the concept will work with many social media platforms, Twitter was the most accessible. Everything is out there in public domain, a giant pool of untapped data, tagged with latitude and longitude. It’s very precise and lends itself to so many uses."
Tweetographer might become publicly available by year-end as a Web app and mobile app, Clifton says: "We are working on giving the graphical user interface an overhaul, migrating servers, as well as some other maintenance."
Clifton thinks the engine they created could be used for other applications, such as predicting election results and compiling product reviews.