This repository was archived by the owner on Mar 11, 2021. It is now read-only.
Theme:
Problem Statement 3: "Bias is learned and perpetuated in different ways (e.g. societal beliefs, misrepresentation, ignorance) that consequently create inequitable outcomes across all spheres of life."
Idea:
Check draft news content for statements that may be inherently biased or could bias public opinion. The tool would be aimed at people producing content for news sites. It would use NLP to scan an article; where a statement is presented as fact, or where racial bias may be present, the application would highlight it, suggest additional resources to check, and perhaps prompt the author to add statistical context so the claim can be verified.
It could also be used by readers (perhaps via a browser plug-in) to highlight areas of an article that may reinforce racism.
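The scanning step described above could start very simply. The sketch below is a toy illustration only: the term list and claim pattern are invented placeholders, and a real implementation would rely on a curated lexicon or a trained classifier rather than hand-written rules.

```python
import re

# Hypothetical placeholder list for illustration; a real tool would use a
# curated lexicon or a trained model, not this toy set.
LOADED_TERMS = {"thug", "illegal alien", "exotic"}

# Rough pattern for sweeping, unsourced claims presented as fact,
# e.g. "everyone knows ..." or "most people agree ...".
UNSOURCED_CLAIM = re.compile(
    r"\b(most|all|every(one|body)?|no one)\b.*\b(know|agree|believe|say)s?\b",
    re.IGNORECASE,
)

def flag_sentences(text):
    """Return (sentence, reason) pairs that may need a bias/source check."""
    flags = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        lowered = sentence.lower()
        if any(term in lowered for term in LOADED_TERMS):
            flags.append((sentence, "loaded term"))
        elif UNSOURCED_CLAIM.search(sentence):
            flags.append((sentence, "unsourced claim"))
    return flags
```

In an editor integration, the returned sentences would be highlighted in the draft, with each reason shown alongside a prompt to add a source or statistical context.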
Uniqueness:
Similar tools may already exist, but perhaps not ones focused specifically on racial bias.
Impact if implemented:
Prompt creators of news content to be thorough in checking that what they produce is factually accurate, and make them aware of any racial bias present in their writing.
Reduce the reinforcement of racism through biased media content.