
The Evolution of Learning, Knowing, & Finding in the Digital Age

Posted Jul 03 2012 1:48pm

Knowledge, information, and intellect are fuzzy concepts. Knowledge may involve the ability to recall specific pieces of information. But does knowing lead to intellect? Is more information always better? And what information is needed for intellect? These are interesting questions, but definitely beyond my philosophical capabilities. Without a doubt, these concepts have evolved in the digital age. An interesting piece entitled Connectivism: A Learning Theory for the Digital Age is worth a read.

In the past, there was an advantage (likely even an incentive) to “knowing” information, because “finding” information was slow, cumbersome, and time-consuming. Think about performing a literature review prior to the internet. It was harder (in both effort and time) to find facts, ideas, and concepts. Potentially, this may have led to slower, more deliberate processing in the form of in-depth analysis and more critical thinking, with reflection, analysis, and connecting to ensure strong knowledge recall.

With the advent of new technologies, and the ever-increasing speed and ease of information transfer, the paradigm may have flipped. With the proliferation of the internet and search tools, finding information continued to become easier and faster (this does not speak to accuracy, validity, or utility, of course). Taking the time to truly know, relate, and connect content was effectively disincentivized as finding it became convenient beyond belief. Even Einstein was quoted as saying, “It’s not what you know, it’s knowing where to find it.” For some information and procedures, this is absolutely true. Atul Gawande addressed this very concept in the book The Checklist Manifesto (which is fantastic! Check out this video summary).

But do the manifestations of this paradigm shift have the potential to be devastating for students and learners of all types, including clinicians? The incentive for laziness is present: a quick Google search, “the abstract says…”, “so-and-so tweeted this.” One must consciously recognize the potential traps, and work hard to critically appraise, connect, reflect on, and relate to information.

The same is true of evidence-based practice. “Well, this article’s conclusion states X is good for Y.” “The systematic review recommends X for Y.” Now, I am not advocating against evidence-based practice, just pointing out a potentially devastating shortcut or pitfall. Without conscious and attentive adherence to prior plausibility, principles of science, and critical thinking, we are all likely to fall victim to “citing the evidence” in this regard. But that really is a different topic, for a different time…

With the advent of Web 2.0 and social media technology, information is pushed directly to you. For better or for worse, masters of technology and social media with large followings or broad connections have the power to proliferate ideas to large numbers of people, many of whom did not even seek this information. The term “viral” captures this concept accurately, as ideas or internet memes exhibit virus-like tendencies. But even small-time social media users can have significant impact if the information they push is deemed useful by those who encounter it, and is thus pushed onward. And so viral growth is born.

The evolution of this technology may prove profoundly beneficial if utilized appropriately. People will encounter information in the form of Facebook status updates, tweets, blog posts, research articles, and news they did not even seek. Technology and social media, including blogs, can be leveraged not only to encounter new information (most of which is not purposefully sought), but to engage, connect, critique, and more deeply understand. Both the author and the reader can benefit, as social media now allows the reader, or consumer, to engage via comments and replies. Learners armed with the power of new technology and the cognitive skills to use it appropriately can make a major impact.

In the future, I foresee these new technologies and paradigms fundamentally changing not just education, but the face of formal science and publishing. Jason Silvernail and I have discussed this before, when debating whether industry standards were serving researchers, clinicians, and science. Building on that topic, Diane Jacobs at SomaSimple recently posted a link to the blog post Why Academic Papers are a Horrible Discussion Forum. These insights set the stage for how new technology and social media can be tools of meaningful change in the future of learning, knowing, finding, and discussing.

This anonymous quote summarizes it best:

“Education means developing the mind, not stuffing the memory.”

Unfortunately, our education system at all levels seems on the cusp of failing in this regard. Some of these technology tools, if not utilized appropriately, may have the potential to exacerbate the problem. But, as we have witnessed, technology also has the potential to drive big changes for the better.
