data journalism - Media Helping Media https://mediahelpingmedia.org Free journalism and media strategy training resources

Computer-assisted reporting (CAR) https://mediahelpingmedia.org/advanced/computer-assisted-reporting-car/ Mon, 31 Mar 2025 06:17:34 +0000

The post Computer-assisted reporting (CAR) first appeared on Media Helping Media.

Computer-Assisted Reporting (CAR) refers to the use of digital tools such as spreadsheets, databases, and basic statistical analysis to interrogate large datasets.

Journalists have used CAR since the early days of newsroom computing to uncover patterns and trends by examining data. CAR has since become a subset of the wider field of expertise known as data journalism – which also includes coding, automation, and data visualisation for interactive storytelling.

In our article ‘What is data journalism?’ we refer to CAR in the context of its role in data journalism. But what is CAR? And how does it differ from data journalism?

  • Computer-Assisted Reporting (CAR):
    • Emerged in the late 20th century as journalists began using computers for reporting.
    • Focuses on using databases, spreadsheets, and basic statistical tools to analyse public records, election results, crime reports, etc.
    • Example: A journalist using Excel to analyse government spending records for a piece of investigative journalism.
  • Data Journalism:
    • A broader, more modern evolution of CAR that includes data collection, analysis, and data visualisation.
    • Incorporates coding, automation, and interactive storytelling techniques.
    • Often involves using programming languages (Python, R), web scraping, machine learning, and data visualisation tools (Tableau, D3.js).
    • Example: The New York Times’ interactive COVID-19 tracking dashboards or The Guardian’s data-driven investigative reports.

Differences between CAR and data journalism:

| Feature | CAR | Data journalism |
| --- | --- | --- |
| Focus | Data analysis for investigative journalism | Data-driven storytelling & visualisation |
| Tools | Spreadsheets, databases | Programming, APIs, visualisation tools |
| Approach | Analysing structured data | Collecting, cleaning, analysing, and visualising data |
| Evolution | 1980s–1990s | 2000s–present |

In short, CAR is an early form of data journalism. While CAR was about using computers for analysis, data journalism has expanded to include sophisticated digital tools, coding, and visual storytelling techniques.
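As a concrete illustration, the kind of spreadsheet analysis CAR is known for can equally be done with a few lines of code. The sketch below is a minimal, hypothetical example – the dataset, column names, and threshold are all invented – showing spending totalled by department and large payments flagged for follow-up:

```python
import csv
import io
from collections import defaultdict

# Stand-in for a downloaded government spending file (all figures invented).
raw = """department,supplier,amount
Health,Acme Supplies,120000
Health,MediCorp,45000
Transport,RoadWorks Ltd,980000
Transport,RoadWorks Ltd,15000
Education,BookHouse,30000
"""

totals = defaultdict(float)     # total spending per department
large_payments = []             # individual payments worth a closer look

for row in csv.DictReader(io.StringIO(raw)):
    amount = float(row["amount"])
    totals[row["department"]] += amount
    if amount > 500000:         # arbitrary threshold for a "large" payment
        large_payments.append((row["supplier"], amount))

print(dict(totals))
print(large_payments)
```

The same questions – how much did each department spend, and which payments stand out? – could be answered with an Excel pivot table; the point is that both are CAR.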

Related articles

What is data journalism?

Good journalism has always been about data

Data journalism – resources and tools

Data journalism glossary

 

Good journalism has always been about data https://mediahelpingmedia.org/advanced/good-journalism-has-always-been-about-data/ Mon, 24 Mar 2025 19:32:20 +0000

The post Good journalism has always been about data first appeared on Media Helping Media.

We are all data journalists, even those who may never have heard of the term before. Data journalism has been around for years; it’s just more accessible and useful now.

I can remember when I first realised I was a data journalist, or at least helping to produce data journalism.

It was in the summer of 1997 when we were getting ready to launch the BBC News website.  (And by the way, I don’t get any marks for being perceptive, because as we point out in the article ‘What is data journalism?‘, all journalists are data journalists, whether they know it or not, so I had been one since the 1960s.)

Anyway, we were looking at how to produce and improve news stories – and all our assumptions belonged in the analogue age.

We were obviously aware that unlike television and radio, online news was not an ephemeral, one-word-at-a-time medium. Users could dwell on text and be directed to other information for valuable context and background.

We wanted to offer rich, instantly-available material that supplemented and enhanced every story.

But to produce that kind of material, we were used to relying on our own and our colleagues’ memories and archives, the BBC’s tape and audio libraries, a newspaper cuttings library and rudimentary newsroom systems that were not connected to the Internet.

In other words, it was a bit haphazard, almost certainly incomplete, relied on a lot of legwork and took ages.

Suddenly, as our tech guru patiently explained to us, we had electronic access to all kinds of valuable material.  He called it “data”.  The penny dropped.

We could automatically link to related stories. We could use search to produce the raw data for time-lines and fact files.  We could pull down stories being written on primitive terminals in the BBC’s  Moscow newsroom and automatically format them as web pages.

We even had a stab at a bit of software that would automatically create a timeline on important, recurring stories. It would search all our sources for, say, unrest in any particular country and produce a list of events.

To make the list usable, we had to instruct it not to put any two items too close together chronologically, unless they were very important, and to exclude items of lesser importance if the list was too long.

It was very ambitious and I cannot remember if we ever got round to implementing this functionality.  If we did, then we almost invented an early version of artificial intelligence.
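The selection rules described above map quite naturally onto a short function. The sketch below is a speculative reconstruction – the field names, importance scores, and thresholds are all invented – of how such a timeline-thinning step might look today:

```python
# Speculative sketch of the timeline-selection rules described in the text.
# Field names, importance scores, and thresholds are invented.
def thin_timeline(events, min_gap_days=30, max_items=5, important=8):
    """events: list of (day, importance, text) tuples, sorted by day.

    Keep chronological order; skip an event that falls too close to the
    previously kept one unless it is very important; if the result is
    still too long, drop the least important items.
    """
    kept = []
    for day, score, text in events:
        too_close = kept and day - kept[-1][0] < min_gap_days
        if too_close and score < important:
            continue                      # crowded and not important enough
        kept.append((day, score, text))
    while len(kept) > max_items:          # trim lowest-importance items
        kept.remove(min(kept, key=lambda e: e[1]))
    return kept

# Invented unrest timeline: (day number, importance 1-10, summary)
events = [(0, 5, "protest"), (10, 3, "arrests"), (50, 9, "government falls"),
          (55, 2, "statement"), (60, 9, "elections called"), (200, 4, "report")]
print(thin_timeline(events))
```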

But now, the real thing is here, and the new capabilities that fascinated and thrilled us in those early years are now easily and freely available to everyone, in much more powerful versions, thanks to the power of large language models, neural networks and immense distributed computing power.

So now, not only are all journalists data journalists, we all have access to immense quantities of priceless data and the tools to make good use of it.  We have listed many of those data tools and resources.

They are wonderful.  But do not forget that in the term “data journalist” the second word is more important than the first.

We should all be thrilled and grateful for the things Artificial Intelligence makes possible, but the most powerful tools are still the human journalist’s instinct, judgement and training.

Bob Eggington



This text offers a fascinating glimpse into the nascent stages of digital journalism, particularly the moment when the author recognised the inherent data-driven nature of the craft. Let’s expand on this, adding depth, meaning, and perspective:

The ubiquity of data journalism 

The assertion that “we are all data journalists” transcends a mere label. It’s a fundamental recognition of the information age’s defining characteristic: the sheer volume of data surrounding us. Even before the term gained currency, journalists were implicitly engaged in data analysis, sifting through facts, statistics, and records to construct narratives. The shift, as the author articulates, lies in the accessibility and utility of data.

The analogue to digital leap

The author’s recollection of the BBC News website’s launch in 1997 is a powerful illustration of this transition. The limitations of analogue methods – reliance on memory, physical archives, and disconnected systems – highlight the transformative potential of digital data. The “tech guru’s” revelation wasn’t just about accessing “valuable material”; it was about recognising the inherent structure and relationships within information, the ability to connect disparate pieces into a coherent whole.

Beyond automation

The ambitious attempt to create an automated timeline generator speaks to the early recognition of AI’s potential in journalism. The challenges faced – managing chronological proximity and prioritising information – are precisely the problems that modern AI and machine learning algorithms address. This anecdote is more than a historical footnote; it’s a testament to the foresight of those who recognised the need for intelligent data processing.

The democratisation of data and tools

The author rightly points out that the tools that were once the exclusive domain of tech-savvy journalists are now widely accessible. Large language models, neural networks, and distributed computing have democratised data analysis, empowering individuals to explore, interpret, and visualise information in unprecedented ways. This democratisation, however, does not diminish the importance of journalistic ethics and skills.

The enduring significance of the journalist

The emphasis on “journalist” over “data” is crucial. While AI can automate tasks and provide insights, it cannot replace the human element of journalism. The author’s “instinct, judgement and training” remain indispensable. This encompasses:

  • Critical thinking: Evaluating the credibility and relevance of data sources.
  • Contextualisation: Placing data within a broader social, political, and historical framework.
  • Ethical considerations: Recognising and mitigating biases in data and algorithms.
  • Narrative construction: Crafting compelling stories that resonate with audiences.
  • Human empathy: Understanding and conveying the human impact of data-driven insights.
  • Accountability: Holding power to account, even when the power is expressed in data.

The evolving role of the data journalist

The modern data journalist is not merely a data wrangler but a storyteller, an investigator, and a communicator. They must possess a blend of technical skills and journalistic acumen. They must be able to:

  • Extract meaningful insights from complex datasets.
  • Visualise data in a clear and engaging manner.
  • Communicate data-driven findings to diverse audiences.
  • Understand the limitations and biases of data and algorithms.
  • Use data to uncover hidden patterns and trends.

A call for responsible innovation

As AI continues to transform journalism, it is essential to remember that technology is a tool, not a replacement for human intelligence. The focus should be on using AI to enhance journalistic capabilities, not to automate them entirely. The ethical implications of AI in journalism – including issues of bias, transparency, and accountability – must be carefully considered.

In conclusion, the author’s reflections provide a valuable perspective on the evolution of data journalism. The journey from analogue limitations to digital possibilities underscores the transformative power of data. However, the enduring importance of journalistic integrity and human judgment reminds us that technology is only as good as the people who use it.

Related articles

Data journalism – resources and tools

Data journalism glossary

What is data journalism?

 

Data journalism – resources and tools https://mediahelpingmedia.org/advanced/data-journalism-resources-and-tools/ Mon, 24 Mar 2025 15:57:38 +0000

The post Data journalism – resources and tools first appeared on Media Helping Media.


We have compiled a list of some of the leading resources and tools that are available for those starting out in data journalism.

This list will be updated over time. You might want to consult our Data journalism glossary to look up some of the terms that appear below.

Tools:

Below is a list of tools used by data journalists. They cover data gathering, cleaning, analysis, and visualisation. These tools are great for both beginners and experienced data journalists:

Data collection & scraping tools

Data cleaning & preparation

Data visualisation tools

Mapping tools

Data analysis & statistics tools

Fact-checking & verification tools

Other handy tools

Tools for specialist reporters and correspondents

Considerations for using free tools:

  • Data privacy: Be mindful of data privacy when using free tools, especially when working with sensitive information.
  • Learning curves: Some powerful free tools might have a steeper learning curve than paid alternatives.
  • Community support: Look for tools with active communities, as this can provide valuable support and resources.

By combining these free resources, you can build a strong foundation in data journalism without breaking the bank.

Websites:

Related articles

Good journalism has always been about data

Data journalism glossary

What is data journalism?

 

Data journalism glossary https://mediahelpingmedia.org/advanced/data-journalism-glossary/ Mon, 24 Mar 2025 12:02:14 +0000

The post Data journalism glossary first appeared on Media Helping Media.

The following words and terms are commonly used in data journalism. Data journalists might want to familiarise themselves with them.

Often used words and phrases

  • Algorithm:
    • A set of rules or instructions that a computer follows to solve a problem or perform a task. In data journalism, algorithms can be used for various purposes. Link: Algorithm
  • API (Application Programming Interface):
    • A digital tool that lets you pull data directly from a website or database, often used by journalists to access updated datasets. Link: API
  • Choropleth map:
    • A map shaded in different colours to show how a number or rate changes by area (e.g., COVID-19 cases by county). Link: Choropleth map
  • Computational thinking:
    • The process of breaking down complex problems into smaller, manageable parts, and then creating algorithms to solve them. Link: Computational thinking
  • Correlation:
    • A relationship between two variables (note: correlation doesn’t mean causation). Link: Correlation
  • CSV (Comma-Separated Values):
    • A common, simple file format for datasets which is basically a spreadsheet saved as plain text. Link: CSV
  • Data analysis:
    • Examining data to identify trends, patterns, and relationships. Link: Data analysis
  • Data bias:
    • When data is skewed or incomplete; journalists need to be alert to this to avoid misleading the audience. Link: Data bias
  • Data cleansing (or wrangling):
    • The process of fixing messy data in order to correct errors, fill in missing info, and format it so it’s ready for analysis. Link: Data cleansing
  • Data ethics:
    • Principles and guidelines for the responsible collection, analysis, and dissemination of data, with a focus on privacy, security, and fairness. Link: Data ethics
  • Data journalism:
    • The practice of using data to find, create, and tell news stories. It involves collecting, analysing, and visualising data to inform the public. Link: Data journalism
  • Data leak (or breach):
    • The release of private or sensitive data, whether intentional or accidental; newsrooms often investigate these incidents. Link: Data leak or breach
  • Data literacy:
    • The ability to understand, interpret, and communicate data effectively. This includes critical thinking, statistical reasoning, and the ability to identify biases. Link: Data literacy
  • Data mining:
    • The process of extracting valuable information and patterns from large datasets. Link: Data mining
  • Data scraping:
    • Data scraping is the automated process of extracting data from websites or other sources and saving it into a structured format. Link: Data scraping
  • Data transparency:
    • Being open about how the data was handled, what assumptions were made, and what might be missing.
  • Data visualisation:
    • Representing data visually through charts, graphs, maps, and other graphical formats. Link: Data visualisation
  • Dataset:
    • Or data-set: a collection of related data, such as a spreadsheet or table, often the starting point for a data story. Link: Dataset
  • Deduplication:
    • Removing repeated entries in a dataset to avoid counting the same thing twice. Link: Data deduplication
  • Descriptive statistics:
    • Simple summaries of data, such as averages, medians, and percentages, that help explain your findings. Link: Descriptive statistics
  • FOIA (Freedom of Information Act) Request:
    • A formal request for government-held records under freedom-of-information laws, often used by journalists to obtain datasets that have not been published.
  • Geospatial data:
    • Data that includes location information which is essential for making maps or analysing patterns by area. Link: Geospatial data
  • Heat map:
    • A graphic that uses colour intensity to show concentrations of activity or numbers. Link: Heat map
  • Interactive graphics:
    • Visuals that let readers explore data such as maps you can zoom in on or filters to compare regions.
  • Interactive visualisation:
    • A data visualisation that responds to user input, allowing readers to filter, zoom, or hover over elements to explore the underlying data.
  • JSON (JavaScript Object Notation):
    • A format often used by websites and APIs to structure data. Journalists may need to convert this into tables. Link: JSON
  • Machine learning:
    • Computer systems analysing data to find patterns. Used in investigative journalism for things like identifying fake accounts. Link: Machine learning
  • Margin of error:
    • A measure of how much uncertainty there is in survey results. This is particularly important when reporting on political opinion polls. Link: Margin of error
  • Natural Language Processing (NLP):
    • A way to automatically analyse large amounts of text such as searching through thousands of documents for themes. Link: NLP
  • Normalisation:
    • Adjusting numbers to make fair comparisons such as calculating rates per 100,000 people instead of raw numbers. Link: Normalisation
  • Open data:
    • Data published by governments, organisations, or researchers that’s free for anyone to use in their reporting. Link: Open data
  • Outlier:
    • A data point that sticks out because it’s much higher or lower than the rest. Sometimes these lead to important news stories. Link: Outlier
  • Parsing:
    • Breaking down complex information (such as addresses or dates) into standardised parts for easier analysis. Link: Parsing
  • Regression analysis:
    • A more advanced statistical method to explore relationships between variables. This is sometimes used in deep journalistic investigations. Link: Regression analysis
  • Sampling bias:
    • This exists when the group surveyed or studied doesn’t represent the larger population. This can distort results and conclusions. Link: Sampling bias
  • SQL (Structured Query Language):
    • A coding language for searching through large databases. This is helpful for investigative journalism projects. Link: SQL
  • Spreadsheet:
    • A basic tool such as Excel or Google Sheets that most journalists use to store, sort, and analyse data. Link: Spreadsheet
  • Statistical analysis:
    • Using statistical methods to analyse data, including things such as finding the mean, median, and mode, and also finding standard deviations. Link: Statistical and data analysis
  • Structured data:
    • Data organised in rows and columns (such as Excel spreadsheets) that’s easy to sort and analyse. Link: Structured data analysis
  • Time series data:
    • Data collected over time. This is useful for spotting trends, such as changes in crime rates or housing prices. Link: Time series database
  • Tooltip:
    • A small pop-up box in a graphic that appears when readers hover over a data point to reveal details. Link: Tooltip
  • Unstructured data:
    • Data that doesn’t come in neat tables, such as PDFs, social media posts, or interview transcripts. Link: Unstructured data
  • Web scraping:
    • The process of automatically extracting data from websites. Link: Web scraping
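To see a few of these terms in practice, the short sketch below – using invented figures – demonstrates deduplication, normalisation (rates per 100,000 people), and descriptive statistics on a small dataset:

```python
from statistics import mean, median

# Raw records: (area, incidents, population) – one row is a duplicate.
# All figures are invented for the example.
rows = [
    ("Northtown", 50, 20000),
    ("Southville", 90, 120000),
    ("Northtown", 50, 20000),   # duplicate entry
    ("Eastport", 30, 15000),
]

# Deduplication: remove repeated entries while preserving order.
unique_rows = list(dict.fromkeys(rows))

# Normalisation: incidents per 100,000 people, for fair comparison.
rates = {area: incidents / pop * 100000 for area, incidents, pop in unique_rows}

# Descriptive statistics: simple summaries of the normalised rates.
print(rates)
print("mean:", mean(rates.values()), "median:", median(rates.values()))
```

Note how normalisation changes the story: Southville has the most incidents in raw numbers, but the lowest rate once population is taken into account.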

Related articles

Data journalism – resources and tools

What is data journalism?

Good journalism has always been about data

 

What is data journalism? https://mediahelpingmedia.org/advanced/what-is-data-journalism/ Mon, 24 Mar 2025 07:51:21 +0000

The post What is data journalism? first appeared on Media Helping Media.


Data journalism, also known as data-driven journalism, is the process of finding, understanding, and processing information in order to produce news stories.

It’s always been part of the news production workflow but has increased in importance since the development of computers and the internet.

In the past, journalists analysed numbers by hand, trying to make sense of what they had jotted down in their notebooks while out covering a story.

Simply by asking the basic journalistic questions – what, why, when, how, where, and who – journalists were already gathering important data, such as:

  • What has happened?
    • Event type and frequency: Crash, fire, riot – is this the first time, the 10th time – how many times?
  • How many people were affected?
    • The number of people killed or injured, and the number of ambulances and police deployed.
  • When did this happen?
    • Time and date, rush hour, drive time, overnight, morning.
  • Where did it happen?
    • Location – street, town, intersection, map reference, accident blackspot, area of known tension perhaps
  • Who could have more information?
    • Local authority or police records, facts and figures regarding similar events in the past.

In the example above the reporter would have jotted down any information they could find about the story they were covering. Those notes contained data which would be an essential part in telling the story.

That data, if processed and then analysed, could help the journalist and their team dig much deeper. But there was limited access to that data.

It would be contained in the reporter’s notebook, in the next edition of the newspaper, or broadcast in the next news bulletin, and stored in a newsroom archive as a physical cutting – but it would be hard to retrieve or be of much further use. (See – The importance of keeping records)

Perhaps a diligent journalist, who was specialising in a particular area, or working on an investigation, would create a simple hand-drawn spreadsheet to try to crunch the numbers, but often they were soon sent off to cover the next story and the data they had gathered would be put to one side.

Then came computers. This enabled journalists to store data and make sense of it using spreadsheets to look for patterns in terms of frequency, size, time, and any relationships between events.

With the development of the internet it became easier to find and share large amounts of data. Computers could be used to connect the data in ways that would have been impossible for a journalist in the past.
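As a simple illustration of connecting data, the sketch below joins two invented datasets on a shared key – locations with frequent accidents matched against a council's overdue-repairs list – to surface possible story leads:

```python
# Two invented datasets that a pre-internet reporter could not easily
# have cross-referenced: accident counts and a council repairs backlog.
accidents = {"High St": 14, "Mill Rd": 3, "Bridge Ave": 11}   # accidents per year
overdue_repairs = {"High St": "resurfacing", "Bridge Ave": "street lighting"}

# Connect the data: locations with many accidents AND an outstanding repair.
leads = {
    location: (count, overdue_repairs[location])
    for location, count in accidents.items()
    if count >= 10 and location in overdue_repairs
}
print(leads)   # candidate story leads worth investigating further
```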

This resulted in computer-assisted reporting (CAR), which uses technology to analyse data and helps journalists find hidden stories and investigate complex issues such as fraud and corruption.

By examining large datasets – structured collections of related data revealing patterns, trends, and relationships –  journalists are able to produce more accurate and impactful journalism.

Computers also enable journalists to display the data they have gathered in graphs, charts, and maps – this is called data visualisation – which means that complex datasets can be displayed in easy-to-understand ways.
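Data visualisation is usually done with charting tools, but the idea can be illustrated with a toy example. The sketch below – using invented figures – turns a small dataset into a text-based bar chart so the pattern is visible at a glance:

```python
# Toy text-based bar chart: all figures are invented for the example.
incidents_by_year = {2021: 12, 2022: 19, 2023: 31, 2024: 26}

def bar_chart(data, width=40):
    """Render {label: value} as horizontal bars scaled to `width` characters."""
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(value / peak * width)
        lines.append(f"{label}  {bar} {value}")
    return "\n".join(lines)

print(bar_chart(incidents_by_year))
```

Even this crude chart makes the 2023 peak obvious in a way that a column of raw numbers does not – which is the whole point of visualisation.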

Data journalism is now an important part of news production with many journalists using advanced tools to find complex stories. And they are able to share their data so everyone can see where the information came from. This also leads to collaboration between different teams of journalists working together on a complex and important investigation.

In summary, data journalism has progressed from being a specialist practice, to an integral part of modern news reporting in several ways:

  • Data analysis: Collecting, organising, and examining large amounts of data to uncover trends, patterns, and news angles.
  • Storytelling: Using the insights uncovered to create compelling and informative news stories, and presenting complex information in a clear and easy to understand way.
  • Visualisation: Creating charts, graphs, and maps to help audiences understand the stories behind the data.
  • Tools: The use of spreadsheets, statistical software, and data visualisation platforms to process data in order to make it more useful in the news production process.
  • Evidence: By including reliable and rich data in stories, data journalism can provide a more objective and evidence-based approach to reporting.
  • Quantity: Data journalism enables a journalist to sift through large amounts of data – such as survey results, financial figures, football results, and government records to find stories hidden within that data.
  • Accessibility: The journalist can then present those stories in a clear and easy-to-understand way using charts and graphs.
  • Reliability: Instead of just relying on someone’s opinion, as has often been the case in the past, the journalist can use facts and figures to back up their reporting.


Data journalism – further thoughts

Journalism has always been a pursuit of truth, sifting through the noise to reveal what matters. At its core lies the fundamental task of gathering, analysing, and presenting information in ways that help society make sense of the world.

Over time, the methods used by journalists have evolved, but one constant remains: data has always been central to storytelling, whether jotted in a notebook or embedded within sprawling digital databases.

What has changed dramatically is the scale, speed, and sophistication with which journalists can access and interrogate information. The digital age has transformed raw data from fragmented observations into powerful tools for accountability, insight, and public understanding.

Where once reporters might have tallied casualty figures by hand or kept mental notes on patterns they noticed over time, they now wield vast datasets – crime records, health statistics, financial disclosures, social media activity – as both sources and subjects of their investigations.

The shift is not merely technological but philosophical. Data-driven journalism reframes the journalist’s role. They are no longer just a chronicler of events, they are also an investigator uncovering patterns invisible to the naked eye.

A single incident becomes part of a larger puzzle: a crash is not just an accident but potentially a symptom of systemic infrastructure failures; a spike in evictions reveals deeper housing inequities; electoral results expose demographic shifts and political realignments.

Data breathes life into these stories, adding context, nuance, and evidence that deepens public understanding.

With computational tools, journalists move beyond surface narratives to probe the why and how, not just the what. Algorithms, spreadsheets, and statistical models allow them to test hypotheses, verify claims, and uncover hidden relationships.

This capability becomes crucial in an era where misinformation spreads fast, and complex issues – such as climate change, global pandemics, and economic inequality – demand rigorous scrutiny.

Equally transformative is the way data enables storytelling. Visualisations such as maps, charts, and interactive graphics help translate complexity into clarity. They allow audiences to see the scale of a crisis, the trajectory of a trend, or the impact of policy decisions in ways that words alone cannot achieve.

Good data visualisation doesn’t just display numbers; it creates an emotional and intellectual connection, turning abstract figures into human stories.

Another profound shift is the collaborative nature of modern data journalism. No longer confined to individual reporters, many of the most impactful investigations today involve teams of journalists, data scientists, designers, and programmers working together across borders.

Global projects such as the Panama Papers or investigations into environmental destruction exemplify the power of shared datasets and collaborative analysis. Transparency in these projects – publishing methodologies, sharing datasets – also strengthens trust in journalism at a time when skepticism is high.

Ultimately, data journalism enriches the very purpose of the media: to inform, to explain, and to hold power to account. By grounding stories in verifiable evidence, it elevates reporting from anecdote to analysis, offering audiences not just opinions but actionable insights.

As data becomes ever more abundant, the journalist’s challenge is to remain not just a transmitter of information, but a skilled interpreter – someone who can connect the dots, surface the hidden stories, and empower the public to see the world more clearly.

Data is no longer a byproduct of reporting; it is a fundamental driver of journalism’s future.

Questions and Answers

  1. Question: What is data journalism, and how has its importance changed over time?
    • Answer: Data journalism, also known as data-driven journalism, is the process of finding, understanding, and processing information to produce news stories. While it has always been a part of news production, its importance has significantly increased with the development of computers and the internet, allowing for more efficient and in-depth analysis of large datasets.
  2. Question: How did journalists gather and analyse data before the widespread use of computers?
    • Answer: Before computers, journalists gathered data by hand, jotting down notes in notebooks and attempting to analyse them manually. They used basic journalistic questions such as “what,” “why,” “when,” “how,” “where,” and “who” to collect information. Sometimes, diligent journalists would create hand-drawn spreadsheets for simple analysis, but this was often time-consuming and limited.
  3. Question: What is Computer Assisted Reporting (CAR), and how has it transformed journalism?
    • Answer: Computer Assisted Reporting (CAR) uses technology to analyse data, helping journalists uncover hidden stories and investigate complex issues like fraud and corruption. By examining large datasets, journalists can identify patterns, trends, and relationships that would be impossible to see manually.
  4. Question: What is data visualisation, and why is it important in data journalism?
    • Answer: Data visualisation involves displaying gathered data in graphs, charts, and maps. It’s important because it allows journalists to present complex datasets in an easy-to-understand way, making it accessible to a wider audience and enhancing the impact of their stories.
  5. Question: How does data journalism contribute to a more objective and evidence-based approach to reporting?
    • Answer: By including reliable and rich data in stories, data journalism provides a more objective and evidence-based approach to reporting. It allows journalists to back up their reporting with facts and figures, rather than relying solely on opinions.
  6. Question: How has the role of a journalist evolved with the rise of data journalism?
    • Answer: The role of a journalist has evolved from simply chronicling events to also becoming an investigator who uncovers patterns and relationships within data. They now use tools to analyse large datasets, test hypotheses, and verify claims, providing deeper insights and accountability.
  7. Question: What are some examples of tools used in data journalism?
    • Answer: Tools used in data journalism include spreadsheets, statistical software, and data visualisation platforms. These tools help journalists process and analyse large datasets, making the information more useful for news production.
  8. Question: How does data journalism enhance storytelling?
    • Answer: Data journalism enhances storytelling by providing context, nuance, and evidence that deepens public understanding. Visualisations such as maps and charts help translate complex data into clear and impactful narratives.
  9. Question: How has collaboration changed in modern data journalism, and why is it important?
    • Answer: Modern data journalism involves increased collaboration among journalists, data scientists, designers, and programmers, often across borders. This collaboration is crucial for tackling complex investigations and sharing datasets, strengthening trust through transparency.
  10. Question: What is the significance of data transparency in data journalism?
    • Answer: Data transparency, such as publishing methodologies and sharing datasets, strengthens trust in journalism, especially in times of skepticism. It allows the audience to see where the information came from and verify the findings, promoting accountability and credibility.

Related articles

Data journalism – resources and tools

Data journalism glossary

Good journalism has always been about data

