Around 11 p.m. one night, you realize there’s a key step your team needs to take on a current project. So, you dash off an email to the team members while you’re thinking about it.
No time like the present, right?
Wrong. As a productivity trainer specializing in attention management, I’ve seen over the past decade how after-hours emails speed up corporate cultures — and that, in turn, chips away at creativity, innovation, and true productivity.
If this is a common behavior for you, you’re missing the opportunity to get some distance from work — distance that’s critical to the fresh perspective you need as the leader. And, when the boss is working, the team feels like they should be working.
This has been a tumultuous year in technology: we have seen security breaches, the iPhone 6, Google Glass, drones, and the virtual reality of Oculus Rift. In 2015, mobile devices will continue to become the prevalent way for people to browse the Internet. Highly visited sites such as google.com already provide a mobile experience by sensing that the request for a web page originates from a hand-held device, and more sites are going to follow suit. More than half of consumer time spent on the Internet is already spent on mobile devices; hence the prediction of strong growth in the mobile web and, in particular, in mobile shopping.
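In practice, "sensing" a hand-held device usually starts with inspecting the User-Agent header of the incoming request. A minimal sketch of that idea (the token list is illustrative, not exhaustive; real sites use far richer device databases):

```python
# Minimal sketch of server-side mobile detection via the User-Agent header.
# The token list below is illustrative only; production systems consult
# detailed device databases rather than a handful of substrings.

MOBILE_TOKENS = ("Mobile", "Android", "iPhone", "iPad", "Windows Phone")

def is_mobile(user_agent: str) -> bool:
    """Return True if the User-Agent string suggests a hand-held device."""
    return any(token in user_agent for token in MOBILE_TOKENS)

iphone_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 8_1 like Mac OS X) "
             "AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12B411")
desktop_ua = "Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36"

print(is_mobile(iphone_ua))   # a phone browser is detected
print(is_mobile(desktop_ua))  # a desktop browser is not
```

A server that detects a mobile client this way can then redirect to a mobile site or serve a mobile-optimized template; many sites instead (or additionally) use responsive CSS so one page adapts to any screen.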
But mobile devices are not only becoming the window of choice to the Internet; they are becoming the computing platform of choice. For example, the CIO of Hargrove Inc., a trade-show and event services company in Lanham, Maryland, is investing in mobile technologies that will make it easier to access floor plans and information from the trade-show floor. For this to be practical, the screen size of mobile devices has to increase, and it already has; take, for instance, Huawei's seven-inch MediaPad X1, a giant smartphone. Thus, another prediction is that the line between smartphones and tablets will blur further, and we will have devices that qualify as both.
3-D printing will continue leaving the domain of academics and enthusiasts, and it will become more and more of a standard, bringing the manufacturing of small parts to the household. The next generation of at-home 3-D printers will use lasers, heat or liquid to bind powders into solid materials.
Global IT giants including Amazon and LinkedIn could be doing far more to raise awareness of the need for better password practices among their users.
Analysis by Professor Steve Furnell, Director of the Centre for Security, Communications and Network Research at Plymouth University, looked into the password security controls in place among ten of the world’s most visited websites.
It revealed that very few of them give detailed guidance on the importance of choosing a secure password, either when users create accounts or when they update them.
The majority also provided little or no information about why password protection matters, and while some did make suggestions about best practice, very few went on to enforce their own advice.
The biggest threat to the Internet is the fact that it was never really designed. Instead, it evolved in fits and starts, thanks to various protocols that were cobbled together to fulfill the needs of the moment. Few of those protocols were designed with security in mind, and those that were offered no more security than was needed to keep out a nosy neighbor, not a malicious attacker.
The result is a welter of aging protocols susceptible to exploit on an Internet scale. Some of the attacks levied against these protocols have been mitigated with fixes, but it’s clear that the protocols themselves need more robust replacements. Here are six Internet protocols that could stand to be replaced sooner rather than later or are (mercifully) on the way out.
I am reading an interesting article, "The Internet that Facebook built," by Michael L. Best. I am not a Facebook user, but the following passage in the article, drawing on a book by José Marichal, is both interesting and worrying:
José Marichal, in his book Facebook Democracy, defines the architecture of disclosure as Facebook's purpose-built environment that systematically, and in some ways insidiously, encourages its users to disclose increasingly personal, revelatory data. Facebook invests millions in perfecting this architecture not out of voyeuristic interest; it is simply their business model. They capture and commodify a portfolio of these disclosures and sell them to advertisers.
Facebook’s interests may not be voyeuristic, but they certainly seem to elicit a voyeuristic behavior from the users.
A study and data set from Brown University on computer science faculty at the top 50 U.S. schools yielded several interesting findings. One was that the Massachusetts Institute of Technology produces the most computer science professors through its Ph.D. programs, with the University of California, Berkeley in second place. Another insight came from categorizing each professor's field of research as theory, systems, informatics, or scientific computing: private universities had the greatest concentration of theory researchers, while public schools had the least. One explanation is that public universities focus more on engineering, while private universities are more science-oriented. A third finding was that theory remains the top field among current computer science faculty, but hiring has shifted sharply in the last three years toward professors specializing in systems and informatics, while the hiring of theory faculty has diminished since 2011.
Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases.
Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the models simulate the extra work an older brain has to do to retrieve a word from a larger vocabulary. When the researchers incorporated that difference into the models, the aging "deficits" largely disappeared.
“What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email. But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.”
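The core of the argument can be illustrated with a deliberately crude toy model: if retrieving a word means discriminating among everything you know, then a larger vocabulary alone makes each retrieval more work, with no "decline" parameter anywhere. The vocabulary sizes and the unordered-search retrieval model below are assumptions for illustration, not the Tübingen group's actual learning models:

```python
import random

random.seed(0)  # fixed seed so the toy simulation is repeatable

def retrieval_cost(vocabulary, target):
    """Toy model: cost = number of candidates examined in an unordered search."""
    shuffled = random.sample(vocabulary, len(vocabulary))
    return shuffled.index(target) + 1

# Hypothetical lexicon sizes: same retrieval mechanism, more words known.
young_vocab = [f"word{i}" for i in range(20_000)]
older_vocab = [f"word{i}" for i in range(40_000)]

trials = 200
young_cost = sum(retrieval_cost(young_vocab, random.choice(young_vocab))
                 for _ in range(trials)) / trials
older_cost = sum(retrieval_cost(older_vocab, random.choice(older_vocab))
                 for _ in range(trials)) / trials

# The larger lexicon yields a higher average cost per retrieval,
# even though the search procedure itself is identical.
print(young_cost, older_cost)
```

The point of the toy is that slower retrieval falls out of knowing more, not of any degraded mechanism, which is the shape of the conclusion Ramscar describes being forced toward.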