jones_supa writes: Phoronix notes that it has been a long time since we last heard of any major innovations or improvements to VirtualBox, the virtual machine software managed by Oracle. This comes while VMware is improving its products on all platforms, and KVM, Xen, Virt-Manager, and related Linux virtualization technologies continue to advance as well. Is there any hope left for a revitalized VirtualBox? It has been said that only four paid developers remain on the VirtualBox team at the company, which is not enough manpower to significantly advance such a complex piece of software. The v4.3 series has received some maintenance updates over the last two years, but that's about it.
dcblogs writes: Evans Data Corp., which provides research and intelligence for the software development industry, said that of the estimated 19 million developers worldwide, 19% are now doing IoT-related work. A year ago, the first year IoT-specific data was collected, that figure was 17%. And when developers were asked whether they plan to work on IoT development over the next year, 44% of respondents said they do, according to Michael Rasalan, director of research at Evans.
itwbennett writes: Researchers from Drexel University, the University of Maryland, the University of Goettingen, and Princeton have developed a "code stylometry" technique that uses natural language processing and machine learning to identify the authors of source code by their coding style. To test how well their code stylometry works, the researchers gathered publicly available data from Google's Code Jam, an annual programming competition that attracts a wide range of programmers, from students to professionals to hobbyists. Looking at data from 250 coders over multiple years, averaging 630 lines of code per author, their code stylometry achieved 95% accuracy in identifying the author of anonymous code (PDF). Using a dataset with fewer programmers (30) but more lines of code per person (1,900), the identification accuracy rate reached 97%.
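The paper's actual pipeline extracts lexical, layout, and syntactic (abstract syntax tree) features and feeds them to a random-forest classifier. As a rough illustration of the general idea only (the features, snippets, and author names below are invented stand-ins, not the researchers' feature set), a minimal sketch in Python with scikit-learn might look like this:

    # Toy illustration of code stylometry: turn each source file into a
    # small vector of style features, then train a classifier to predict
    # the author. These surface-level features are simplified stand-ins;
    # the actual paper uses a much richer set, including AST-derived ones.
    import re
    from sklearn.ensemble import RandomForestClassifier

    def style_features(source):
        lines = source.splitlines() or [""]
        idents = re.findall(r"[A-Za-z_]\w*", source)
        return [
            sum(len(l) for l in lines) / len(lines),              # mean line length
            sum(l.startswith("\t") for l in lines) / len(lines),  # tab-indent rate
            float(source.count("{")),                             # brace count
            sum(len(i) for i in idents) / max(len(idents), 1),    # mean identifier length
            float(source.count("//") + source.count("#")),        # comment markers
        ]

    # Hypothetical training data: (source snippet, author) pairs.
    samples = [
        ("int main(){return 0;}", "alice"),
        ("int main ()\n{\n\treturn 0;\n}\n// done", "bob"),
        ("int f(){int x=1;return x;}", "alice"),
        ("int f ()\n{\n\tint value = 1;\n\treturn value;\n}", "bob"),
    ]
    X = [style_features(src) for src, _ in samples]
    y = [author for _, author in samples]

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    # Expected: 'alice', whose compact brace style this snippet matches.
    print(clf.predict([style_features("int g(){return 2;}")]))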
Nerval's Lobster writes: What does it take to become a great — or even just a good — software developer? According to developer Michael O. Church's answer on Quora (later republished by Lifehacker), it's a long list: great developers are unafraid to learn on the job, manage their careers aggressively, know the politics of software development (which he refers to as 'CS666'), avoid long days when feasible, and can tell fads from technologies that actually endure... and those are just a few of his points. Over at Salsita Software's corporate blog, meanwhile, CEO and founder Matthew Gertner boils it all down to a single point: experienced programmers and developers know when to slow down. What do you think separates the great developers from the not-so-fantastic ones?
An anonymous reader writes: There has been a furious effort over the past few years to bring the teaching of programming into the core academic curricula. Enthusiasts have been quick to take up the motto: "Coding is the new literacy!" But long-time developer Chris Granger argues that this is not the case: "When we say that coding is the new literacy, we're arguing that wielding a pencil and paper is the old one. Coding, like writing, is a mechanical act. All we've done is upgrade the storage medium. ... Reading and writing gave us external and distributable storage. Coding gives us external and distributable computation. It allows us to offload the thinking we have to do in order to execute some process. To achieve this, it seems like all we need is to show people how to give the computer instructions, but that's teaching people how to put words on the page. We need the equivalent of composition, the skill that allows us to think about how things are computed."
He further suggests that, if anything, the "new" literacy should be modeling — the ability to create a representation of a system that can be explored or used. "Defining a system or process requires breaking it down into pieces and defining those, which can then be broken down further. It is a process that helps acknowledge and remove ambiguity and it is the most important aspect of teaching people to model. In breaking parts down we can take something overwhelmingly complex and frame it in terms that we understand and actions we know how to do."
mikejuk writes: Bjarne Stroustrup, the creator of C++, is the 2015 recipient of the Senior Dahl-Nygaard Prize, considered the most prestigious prize in object-oriented computer science. Established in 2005, it honors the pioneering work on object-orientation of Ole-Johan Dahl and Kristen Nygaard, who designed Simula, the original object-oriented language, and are remembered as "colorful characters." To be eligible for the senior prize, an individual must have made a "significant long-term contribution to the field of Object-Orientation," and this year it goes to Bjarne Stroustrup for the design, implementation, and evolution of the C++ programming language. You can't argue with that.
DW100 writes: In a bizarre public blog post, BlackBerry CEO John Chen has claimed that net neutrality laws should include forcing app developers to make their services available on all operating systems. Chen goes so far as to cite Apple's iMessage tool as a service that should be made available for BlackBerry, because at present the lack of an iMessage BlackBerry app is holding the firm back.
Some excerpts from Chen's plea: Netflix, which has forcefully advocated carrier neutrality, has discriminated against BlackBerry customers by refusing to make its streaming movie service available to them. Many other applications providers similarly offer service only to iPhone and Android users. ... Neutrality must be mandated at the application and content layer if we truly want a free, open and non-discriminatory internet. All wireless broadband customers must have the ability to access any lawful applications and content they choose, and applications/content providers must be prohibited from discriminating based on the customer’s mobile operating system. Since "content providers" are writing code they think makes sense for one reason or another (expected returns, financial or psychic), a mandate to write more code seems like a good way to re-learn why contract law frowns on specific performance.
First-time accepted submitter thomawack writes: As a designer I always build web designs from scratch and put them into CMSMS. Frameworks are too complicated to work with, their code is usually bloated, and adaptable online solutions are (or were) limited in their options. I know my way around HTML/CSS, but I am not a programmer. My problem is that always starting from scratch has become too expensive for most customers. I see more and more adaptive online solutions that seem to be more flexible, but I am a bit overwhelmed because there are so many of them. Is there something you can recommend?
theodp writes: Coding got a couple of shout-outs from the White House in Tuesday's State of the Union Address. "Thanks to Vice President Biden's great work to update our job training system," said President Obama (YouTube), "we're connecting community colleges with local employers to train workers to fill high-paying jobs like coding, and nursing, and robotics." And among the so-called "boats" in the new "River of Content" that the White House social media folks came up with to enhance the State of the Union is a card intended to be shared on Twitter & Facebook which reads, "Let's teach more Americans to code. (Even the President is learning!)."
President Obama briefly addressed human spaceflight, saying, "I want Americans to win the race for the kinds of discoveries that unleash new jobs – converting sunlight into liquid fuel; creating revolutionary prosthetics, so that a veteran who gave his arms for his country can play catch with his kid; pushing out into the Solar System not just to visit, but to stay." He also called once more for action on climate change. PolitiFact has an annotated version of the transcript for more background information on Obama's statements, and FiveThirtyEight has a similar cheat sheet.
Nerval's Lobster writes: While some programming languages achieved early success only to fall by the wayside (e.g., Delphi), one language that has quietly gained popularity is D, which now ranks 35th in the most recent Tiobe Index. Inspired by C++, D is a general-purpose systems and applications language that is similar to C and C++ in its syntax; it supports procedural, object-oriented, metaprogramming, concurrent, and functional programming. D's syntax is simpler and more readable than C++'s, mainly because D creator Walter Bright developed several C and C++ compilers and is familiar with the subtleties of both languages. D's advocates argue that the language is well thought out, avoiding many of the complexities encountered in modern C++ programming. So shouldn't it be more popular?
theodp writes: ICT/Computing teacher Ben Gristwood justifies his choice of Visual Basic as a programming language (as a gateway to other languages), sharing an email he sent to a parent who suggested VB was not as 'useful' as Python. "I understand the popularity at the moment of the Python," Gristwood wrote, "however this language is also based on the C language. When it comes to more complex constructs Python cannot do them and I would be forced to rely on C (which is incredibly complex for a junior developer) VB acts as the transition between the two and introduces the concepts without the difficult conventions required. Students in Python are not required to do things such as declare variables, which is something that is required for GCSE and A-Level exams." Since AP Computer Science debuted in 1984, it has transitioned from Pascal to C++ to Java. For the new AP Computer Science Principles course, which will debut in 2016, the College Board is leaving the choice of programming language(s) up to the teachers. So, if it were your call, what would be your choice for the Best Programming Language for High School?
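For what it's worth, the declaration point Gristwood raises is easy to make concrete. A minimal, purely illustrative comparison (the VB.NET lines appear as comments; the variable names are invented):

    # The declaration difference in miniature. In VB.NET (with Option
    # Explicit on, the default) a variable must be declared with a type
    # before use:
    #
    #     Dim score As Integer = 0
    #     Dim name As String = "Ada"
    #
    # In Python, the same variables simply spring into existence on first
    # assignment, with types tracked at runtime rather than declared:
    score = 0
    name = "Ada"
    score += 10
    print(name, score)  # prints: Ada 10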
theodp writes: Some of the world's leading data scientists are on the payrolls of Microsoft, Google, Facebook, Yahoo, and Apple. So it'd be interesting to get their take on the infographics the tech giants have passed off as diversity data disclosures. Microsoft, for example, reported that its workforce is 29% female, which isn't great, but if one takes the trouble to run the numbers on a linked EEO-1 filing snippet (PDF), some things look even worse. For example, only 23.35% of its reported white U.S. employee workforce is female (Microsoft, like Google, footnotes that "Gender data are global, ethnicity data are US only"). And while Google and Facebook blame their companies' lack of diversity on the demographics of U.S. computer science grads, CS grad and nationality breakouts were not provided as part of their diversity disclosures. Also, the EEOC notes that EEO-1 numbers reflect "any individual on the payroll of an employer who is an employee for purposes of the employer's withholding of Social Security taxes," further muddying the disclosures of companies relying on imported talent, like H-1B-dependent Facebook. So, were the diversity disclosure mea culpas less about providing meaningful data for analysis, and more about deflecting criticism and convincing lawmakers there's a need for education and immigration legislation (aka Microsoft's National Talent Strategy) that's in tech's interest?
An anonymous reader writes: Linus Torvalds has sent a lengthy statement to Ars Technica responding to statements he made at a conference in New Zealand. One of his classic comments in NZ was: "I'm not a nice person, and I don't care about you. I care about the technology and the kernel — that's what's important to me." On diversity, he said that "the most important part of open source is that people are allowed to do what they are good at" and that "all that stuff is just details and not really important." Now he writes: "What I wanted to say — and clearly must have done very badly — is that one of the great things about open source is exactly the fact that different people are so different," and "I don't know where you happen to be based, but this 'you have to be nice' seems to be very popular in the US," calling the concept of being nice an "ideology."
New submitter msubieta writes: I have been developing applications for small businesses using Windows and SQL Server. I would like to move on and start doing the same thing on Linux. I have looked at several frameworks, databases, and development environments, and I really don't know which approach is the best, simplest, and fastest to learn. I use VS and C# mostly, although I could easily go back to C++. I found that Qt and GTK+ are the most common frameworks, but they seem to lack controls that deal with datasets and such (sorry, spoiled by the .NET form controls), though I also know I could use Mono to make the jump. I would have no problem moving to MySQL, as I have done quite a lot of work on that side, and I would like to stick with a traditional client-server application, as I find it easier to maintain and a whole lot more robust when it comes to user interaction (web apps for POS applications don't seem to be the right way to go, in my view). Any suggestions/comments/recommendations?
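One note on the "controls that deal with datasets" concern: Qt does ship data-aware widgets in its QtSql module, where a QSqlTableModel can be bound to a QTableView to get an editable grid over a database table. Here is a minimal sketch using the PyQt5 Python bindings (the same classes exist in C++ Qt); the host, credentials, and "customers" table are invented placeholders, and Qt's QMYSQL driver plugin must be installed:

    # Minimal sketch: an editable grid bound to a MySQL table via QtSql.
    import sys
    from PyQt5.QtWidgets import QApplication, QTableView
    from PyQt5.QtSql import QSqlDatabase, QSqlTableModel

    app = QApplication(sys.argv)

    db = QSqlDatabase.addDatabase("QMYSQL")  # requires the MySQL driver plugin
    db.setHostName("localhost")              # placeholder connection details
    db.setDatabaseName("shopdb")
    db.setUserName("appuser")
    db.setPassword("secret")
    if not db.open():
        sys.exit("Could not connect: " + db.lastError().text())

    model = QSqlTableModel()
    model.setTable("customers")              # hypothetical table name
    model.setEditStrategy(QSqlTableModel.OnFieldChange)  # write edits back per field
    model.select()                           # fetch the rows

    view = QTableView()
    view.setModel(model)                     # editable, sortable grid over the table
    view.show()

    sys.exit(app.exec_())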
schwit1 writes: The EFF has launched a new app that makes it easier for people to take action on digital rights issues from their phones. The app lets folks connect to the EFF's action center quickly and easily from a variety of mobile devices. Sadly, though, they had to leave out Apple devices and the folks who use them. Why? Because the EFF could not agree to the terms in Apple's Developer Agreement and Apple's DRM requirements.
Nerval's Lobster writes: There is no shortage of programming languages, from the well-known (Java and C++) to the outright esoteric (intended just for research or even humor). While the vast majority of people learn the most popular ones, lesser-known programming languages can also secure you a good gig in a specific industry. Which languages? Client-server programming with Opa, Salesforce's APEX language, Mathematica and MATLAB, ASN.1, and even MIT's App Inventor 2 all belong on that list, according to developer Jeff Cogswell. On the other hand, none of these languages has really achieved broad adoption; ASN.1 and SMI, for example, are primarily used in telecommunications and network management. So is it really worth taking the time to learn a new, little-used language for anything other than the thrills?
mrspoonsi writes: Respected developer Marco Arment is worried about Apple's future. In a blog post, he writes, "Apple's hardware today is amazing — it has never been better. But the software quality has taken such a nosedive in the last few years that I'm deeply concerned for its future." Arment was CTO at Tumblr before he left to work on Instapaper. "Apple has completely lost the functional high ground," says Arment. "'It just works' was never completely true, but I don't think the list of qualifiers and asterisks has ever been longer." He blames Apple's prioritization of marketing for the problems with its software: Apple wants new software releases each year as a marketing hook, but the annual update cycles are leading to too many bugs and problems, he says. "I suspect the rapid decline of Apple's software is a sign that marketing has a bit too much power at Apple today: the marketing priority of having major new releases every year is clearly impossible for the engineering teams to keep up with while maintaining quality. Maybe it's an engineering problem, but I suspect not — I doubt that any cohesive engineering team could keep up with these demands and maintain significantly higher quality."