This project is research into online journalism and the spread of rumors and fake information through new media and the Internet. On the internet, vast amounts of fake information are spread every second. Analysis of these rumors shows that most of them are based on assumptions, prejudices, and mental models.
How can we capture and preserve fake information from the web while it goes viral, and then demonstrate that it is often quickly modified? How can we trace it? How can we raise people's awareness of this topic?
Contextualised Information is a hybrid publication. It offers a real-time archive of hoaxes and rumors spread online, together with a critical perspective on the question, by comparing three different points of view for each headline: the media outlet spreading the news, the reactions of people on Twitter, and the analysis of a specialist (sociologist, philosopher, writer).
The rumors are sorted into categories: Politics, Economics and Technology, Miscellaneous, and People.
A series of Python scripts records data from different web sources, tracked with the real-time hoax debunker emergent.info, sorts it chronologically, and places it in the appropriate part of the book.
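A minimal sketch of this sorting step, assuming hypothetical rumor records with `headline`, `category`, and `date` fields (the actual scrapers and the emergent.info data format may differ):

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical records; the real scripts pull these from web sources
# tracked by emergent.info, and the field names may differ.
rumors = [
    {"headline": "Rumor A", "category": "Politics", "date": "2015-03-02"},
    {"headline": "Rumor B", "category": "Economics and Technology", "date": "2015-01-15"},
    {"headline": "Rumor C", "category": "Politics", "date": "2015-02-10"},
]

def sort_into_chapters(records):
    """Group rumors by category, each chapter in chronological order."""
    chapters = defaultdict(list)
    for r in sorted(records, key=lambda r: datetime.strptime(r["date"], "%Y-%m-%d")):
        chapters[r["category"]].append(r)
    return dict(chapters)

chapters = sort_into_chapters(rumors)
```

Each chapter of the book can then be rendered directly from its chronologically ordered list.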
The layout is computed and automated, making the publication fast and easy to produce.
The book was designed in HTML and CSS and exported to PDF with Prince XML.
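The export can be driven from the same Python toolchain by shelling out to the `prince` command-line tool. A sketch, assuming hypothetical file names and that Prince is installed locally:

```python
import subprocess

def prince_command(html_path, css_path, pdf_path):
    """Build a Prince CLI invocation: prince book.html -s book.css -o book.pdf.
    File names here are placeholders, not the project's actual files."""
    return ["prince", html_path, "-s", css_path, "-o", pdf_path]

cmd = prince_command("book.html", "book.css", "book.pdf")
# subprocess.run(cmd, check=True)  # uncomment when Prince is installed
```

Because the layout lives entirely in HTML and CSS, regenerating the whole book is a single command.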
The book encapsulates the last three months of rumors. Between the thematic sections, quotations from texts by well-known authors and specialists on rumor are shown.
Three different indexes are generated automatically, allowing readers to navigate the book in different ways and to view the rumors through different analytical lenses.
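One such index generator might look like the following sketch, which builds an alphabetical index from entries carrying a headline and a page number; the entry fields and the three actual index criteria are assumptions here:

```python
def build_index(entries, key):
    """Map each value of `key` to the sorted list of pages where it appears."""
    index = {}
    for e in sorted(entries, key=lambda e: str(e[key]).lower()):
        index.setdefault(e[key], []).append(e["page"])
    return index

# Hypothetical entries; the real scripts would emit one index per criterion
# (e.g. by headline, by source, by date).
entries = [
    {"headline": "Zebra hoax", "page": 12},
    {"headline": "Apple rumor", "page": 3},
    {"headline": "Apple rumor", "page": 40},
]
idx = build_index(entries, "headline")
```

Calling `build_index` with a different `key` yields a differently ordered view of the same rumors, which is how multiple indexes can share one data source.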
The book is exported as a PDF file, which can be downloaded, printed on demand, or read on a tablet.
The project was made with the help of Mathieu Allard, who contributed the Python scripts.
The project is still in development. https://github.com/LenaRobin/diplome