I've been spending more and more time lately trying to keep up with the news. Quite frankly, it's overwhelming, and probably unhealthy—spending 2-3 hours a day reading about Donald Trump is enough to drive anyone crazy.
I think there are a few reasons for this: I just like reading, I'm afraid of missing something "important" (which my friends/coworkers/etc. will catch), and I'm afraid of missing something actionable. I don't want to find myself in a position where something Really Bad(tm) happened, and I didn't do everything I could to prevent it.
That said, I also need to make sure I take care of myself, and that means spending time on things that aren't politics. It means having my own hobbies and projects at home, spending time with friends, exercising, eating, sleeping, and generally relaxing.
So how do I balance my fear of missing out with taking care of myself?
The first and most important thing I need to do is limit my time in front of a news reader. I can make the choice that all of the aforementioned self-care tasks are more important than keeping abreast of everything that's going on.
I can also change how and when I check news. I can set guidelines like the following:
- Always check my RSS reader before Twitter.
- Sort articles newest-first so I'm starting with the most up-to-date information.
- When I've spent "enough" time catching up, mark as read everything I didn't get to.
But I'm also thinking about how best to use the time I do spend on news, and the reality is that I'm spending a lot of it just sifting through headlines for anything relevant. I want to cast a wide net; I follow roughly 20 newsfeeds (not counting "fun stuff" like xkcd), and that means a lot of headlines. Maybe 1 out of every 30-40 actually holds my interest, which is a pretty low signal-to-noise ratio.
So how do I reduce the noise? Can I still cast a wide net and see only the things that are most relevant across all my chosen sources?
By now you're probably thinking, "No, Des! This isn't a software problem!" And you're right: I'm not super keen on letting software decide what I do and don't see at any given moment, at least not without a clearly defined, easy-to-understand set of rules governing that decision. But I do have to wonder if there's some socially responsible way to do algorithmic filtering.
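To make that concrete, here's a minimal sketch of what a fully transparent, rule-based filter could look like. Everything in it is hypothetical: the `Article` shape, the keyword list, the muted sources. The point is just that every keep-or-drop decision traces back to a rule a human wrote and can read, rather than an opaque model.

```python
from dataclasses import dataclass

# Hypothetical article record; these field names are illustrative,
# not from any real feed-reader API.
@dataclass
class Article:
    title: str
    source: str
    keywords: list

# Explicit, human-readable rules: every filtering decision is
# traceable to one of these entries, so nothing is hidden.
INTEREST_KEYWORDS = {"healthcare", "voting", "privacy"}
MUTED_SOURCES = {"outrage-daily"}

def is_relevant(article: Article) -> bool:
    """Keep an article only if a clearly defined rule says so."""
    if article.source in MUTED_SOURCES:
        return False
    return any(k in INTEREST_KEYWORDS for k in article.keywords)

articles = [
    Article("Senate votes on healthcare bill", "wire-service", ["healthcare", "senate"]),
    Article("You won't believe this outrage", "outrage-daily", ["outrage"]),
    Article("Local team wins again", "sports-desk", ["sports"]),
]

kept = [a.title for a in articles if is_relevant(a)]
# Only the first article matches an interest keyword from an unmuted source:
# kept == ["Senate votes on healthcare bill"]
```

The tradeoff is obvious even at this scale: a keyword list this blunt would miss plenty of things I'd actually care about, which is exactly the tension the rest of this post is wrestling with.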
What kind of tradeoffs would be necessary? Sources notwithstanding, could we even reach something that approximates "unbiased" and "fact-based" (or at least, not consistently biased in any particular direction)? Can we avoid the pitfall of "this is popular, therefore it's right"?
I'm not sure much of this is possible without human intervention (and it may not be possible even with it). But it would be interesting to try.