spadadot writes: A French startup has just launched a website that lets you add your skills to a comprehensive map of human skills. As quoted from their website: "We aim to build the largest, most accurate, multilingual skills database ever made, by allowing a diverse and skillful community to contribute their individual skills to the global map." The ontology is simple: skills can have zero or more sub-skills. Every new skill is available in all supported languages (only English and French at the moment). The crowdsourced data is free for non-commercial use.
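The "zero or more sub-skills" ontology amounts to a simple tree of multilingual nodes. A minimal sketch of that data model in Python (the class and field names here are illustrative assumptions, not the site's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    # Names keyed by language code; every skill exists in all supported languages.
    names: dict                                   # e.g. {"en": ..., "fr": ...}
    sub_skills: list = field(default_factory=list)  # zero or more child skills

    def name(self, lang="en"):
        # Fall back to English if a translation is missing.
        return self.names.get(lang, self.names["en"])

programming = Skill({"en": "Programming", "fr": "Programmation"})
programming.sub_skills.append(Skill({"en": "Python", "fr": "Python"}))

print(programming.name("fr"))        # Programmation
print(len(programming.sub_skills))   # 1
```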
70 comments | about two weeks ago
Sez Zero writes with news about the latest from Amazon Web Services: "Once again Amazon Web Services is taking on Oracle, the kingpin of relational databases, with Aurora, a relational database that is as capable as 'proprietary database engines at 1/10 the cost,' according to AWS SVP Andy Jassy. Amazon is right that customers, even big Oracle customers who hesitate to dump tried-and-true database technology, are sick of Oracle's cost structure and refusal to budge from older licensing models. Still, there are very few applications that are more 'sticky' than databases, which typically contain the keys to the kingdom. Financial institutions see their use of Oracle databases as almost a prerequisite for compliance, although that perception may be changing."
102 comments | about two weeks ago
schwit1 writes U.S. District Judge Tanya Chutkan said the bureau's Next Generation Identification program represents a "significant public interest" due to concerns regarding its potential impact on privacy rights and should be subject to rigorous transparency oversight. "There can be little dispute that the general public has a genuine, tangible interest in a system designed to store and manipulate significant quantities of its own biometric data, particularly given the great numbers of people from whom such data will be gathered," Chutkan wrote in an opinion.
79 comments | about two weeks ago
New submitter puzzled_decoy writes: The company I work for has decided to get in on this "big data" thing. We are trying to find a good data warehouse system to host and run analytics on, you guessed it, a bunch of data. Right now we are looking into MSSQL and a company called Domo, and Oracle has contacted us. Google BigQuery may be another option. At its core, we need to be able to query huge amounts of data in sometimes rather odd ways. We need a strong ETL layer, and hopefully we can put a nice visual reporting service on top of wherever the data is stored. So, what is your experience with "big data" servers and services? What would you recommend, and what are the pitfalls you've encountered?
147 comments | about two weeks ago
Lasrick writes: What a great idea: the Old Weather Project uses old logbooks to study the weather patterns of long ago, providing a trove of archival data to scientists who are trying to fill in the details of our knowledge about the atmosphere and the changing climate. "Pity the poor navigator who fell asleep on watch and failed to update his ship's logbook every four hours with details about its geographic position, time, date, wind direction, barometric readings, temperatures, ocean currents, and weather conditions." As Clive Wilkinson of the UK's National Maritime Museum adds, "Anything you read in a logbook, you can be sure that it is a true and faithful account."
The Old Weather Project uses citizen scientists to transcribe and digitize observations that were scrupulously recorded on a clockwork-like basis, and it is one of several that climate scientists are using to create "a three-dimensional computer simulation that will provide a continuous, century-and-a-half-long profile of the entire planet's climate over time" — the 20th Century Reanalysis Project. Data is checked and rechecked by three different people before entry into the database, and the logbook measurements are especially valuable because they were compiled at sea.
102 comments | about two weeks ago
hazeii writes: Through legal proceedings following the Snowden revelations, Liberty UK has succeeded in forcing GCHQ to reveal secret internal policies allowing Britain's intelligence services to receive unlimited bulk intelligence from the NSA and other foreign agencies, and to keep this data in a massive searchable database, all without a warrant. Apparently, British intelligence agencies can "trawl through foreign intelligence material without meaningful restrictions", and can keep copies of both content and metadata for up to two years. There is also mention of data obtained "through US corporate partnerships". According to Liberty, this raises serious doubts about oversight by the UK Intelligence and Security Committee and its reassurances that in every case where GCHQ sought information from the US, a warrant for interception signed by a minister was in place.
Eric King, Deputy Director of Privacy International, said: "We now know that data from any call, internet search, or website you visited over the past two years could be stored in GCHQ's database and analyzed at will, all without a warrant to collect it in the first place. It is outrageous that the Government thinks mass surveillance, justified by secret 'arrangements' that allow for vast and unrestrained receipt and analysis of foreign intelligence material, is lawful. This is completely unacceptable, and makes clear how little transparency and accountability exists within the British intelligence community."
95 comments | about three weeks ago
lkcl writes: An open letter to the core developers behind OpenLDAP (Howard Chu) and Python-LMDB (David Wilson) tells the story of the successful creation of a high-performance task scheduling engine written (perplexingly) in Python. Even with only partial optimization, tasks are executed in parallel at a phenomenal rate of 240,000 per second; the choice of Python-LMDB for the per-task database store, based on its benchmarks as well as its well-researched design criteria, turned out to be the right decision. Part of the success was also due to earlier architectural advice gratefully received here on Slashdot. What is puzzling, though, is that the LMDB article on Wikipedia is constantly being deleted, despite the project's "notability" by way of being used in a seriously long list of prominent software libre projects, adoption driven in part by the Oracle-led BerkeleyDB license change. It would appear that the original complaint about notability came from an Oracle employee as well.
98 comments | about a month ago
An anonymous reader writes: Drupal has patched a critical SQL injection vulnerability in version 7.x of the content management system that can allow arbitrary code execution. The flaw lies in an API that is specifically designed to help prevent SQL injection attacks. "Drupal 7 includes a database abstraction API to ensure that queries executed against the database are sanitized to prevent SQL injection attacks," the Drupal advisory says. "A vulnerability in this API allows an attacker to send specially crafted requests resulting in arbitrary SQL execution. Depending on the content of the requests this can lead to privilege escalation, arbitrary PHP execution, or other attacks."
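The irony of the bug is that abstraction APIs like Drupal's exist precisely to bind user input as data rather than splicing it into SQL text. A generic sketch of the difference, using Python's `sqlite3` rather than Drupal's PHP API (table and values are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Parameterized query: the input is bound as data, never parsed as SQL.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "alice' OR '1'='1"

# Naive string interpolation lets the payload rewrite the query itself.
unsafe_sql = "SELECT role FROM users WHERE name = '%s'" % user_input
print(conn.execute(unsafe_sql).fetchall())  # [('admin',)] -- injection succeeds
```

The Drupal flaw was a hole in exactly this kind of placeholder-expansion layer, so crafted input reached the SQL text anyway.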
54 comments | about a month ago
jfruh writes: It used to be that you could get an Oracle database certification and declare yourself Oracle-certified for the rest of your career. That time is now over, causing a certain amount of consternation among DBAs. On the one hand, it makes sense that someone who's only been certified on a decade-old version of the product should need to prove they've updated their skills. On the other, Oracle charges for certification and will definitely profit from this shift.
108 comments | about a month ago
minstrelmike writes: If we give up all our privacy online for contextual ads, then how come so many of them are so far off the mark? Personal data harvesting for contextual ads and content should be a beautiful thing. It's done privately and securely, and it's all automated so that no human being actually learns anything about you. And then the online world becomes customized, just for you. The real problem with this scenario is that we're paying for contextual ads and content with our personal data, but we're not getting what we pay for. Facebook advertising is off target and almost completely irrelevant. The question is: why? Facebook has a database of our explicitly stated interests, which many users fill out voluntarily. Facebook sees what we post about. It knows who we interact with. It counts our likes, monitors our comments and even follows us around the Web. Yet, while the degree of personal data collection is extreme, the advertising seems totally random.
249 comments | about a month and a half ago
Almost a year ago you had a chance to ask professor Kevin Fu about medical device security. A number of events (including the collapse of his house) conspired to delay the answering of those questions. Professor Fu has finally found respite from calamity, coincidentally at a time when the FDA has issued guidance on the security of medical devices. Below you'll find his answers to your old but not forgotten questions.
21 comments | about a month and a half ago
cold fjord sends word about what the Dubai police plan to do with Google Glass: "Police officers in Dubai will soon be able to identify suspects wanted for crimes just by looking at them. Using Google Glass and custom-developed facial recognition software, Dubai police will be able to capture photos of people around them and search their faces in a database of people wanted for crimes ... When a match is made in the database, the Glass device will receive a notification. ... What's particularly interesting about the project is that facial recognition technology is banned by the Google Glass developer policy. ... The section of the policy that addresses such technology seems to disqualify the Dubai police force's plan for Glass."
122 comments | about a month and a half ago
alphadogg writes Computer scientists who made breakthroughs in areas such as software architectures and database management systems were among those named National Medal of Technology and Innovation winners by President Barack Obama. These awards, along with the National Medal of Science, are the nation's highest honors for achievement and leadership in advancing the fields of science and technology. Overall, 18 medalists were named.
53 comments | about a month and a half ago
First-time accepted submitter Mike Sheen writes: I'm the lead developer for an Australian ERP software outfit. For the last 10 years or so we've been using Bugzilla as our issue tracking system. I made this publicly available to the degree that anyone could search and view bugs. Our software is designed to be extensible, and as such we have a number of third-party developers making customizations and integrating with our core product.
We've been pumping out builds and publishing them as "Development Stream (Experimental / Unstable)" and "Release Stream (Stable)", and this is visible on our support site to all. We had also been providing a link next to each build, with text showing the number of bugs fixed and the number of enhancements introduced; the URL would take readers to the Bugzilla list of issues for that milestone which were of type bug or enhancement.
This had been appreciated by our support and developer community, as they could readily see which issues were addressed and which new features had been introduced. Prior to exposing our Bugzilla database publicly, we produced a sanitized list of changes, which was time-consuming to produce and which I decided was unnecessary given we could just expose the "truth" with simple links to the Bugzilla search for that milestone.
The sales and marketing team didn't like this. Their argument is that competitors use this against us to paint us as producers of buggy software. I argue that transparency is good and beneficial, and that whilst our competitors don't publish such information, following their practices would simply mean joining them in the race to the bottom in terms of software quality and opaqueness.
In my opinion, transparency of software issues provides:
Identification of which release or build a certain issue is fixed.
Recognition that we are actively developing the software.
Incentive to improve quality controls as our "dirty laundry" is on display.
Information critical to 3rd party developers.
A projection of integrity and honesty.
I've yielded to the sales and marketing demands such that we no longer display the links next to each build for fixes and enhancements, and now publish "Development Stream (Experimental / Unstable)" as simply "Development Stream", but I know what is coming next: a request to no longer make our Bugzilla database publicly accessible. I still have the Bugzilla database publicly exposed, but there is now no longer the "click this link to see what we did in this build".
A compromise may be to make the Bugzilla database only visible to vetted resellers and developers — but I'm resistant to making a closed "exclusive" culture. I value transparency and recognize the benefits. The sales team are insistent that exposing such detail is a bad thing for sales.
I know that by posting in a community like Slashdot I'm going to get a lot of support for my views, but I'm also interested in what people think about the viewpoint that such transparency could be a bad thing.
159 comments | about 2 months ago
New submitter RaDag writes: PostgreSQL outperformed MongoDB, the leading document database and NoSQL-only solution provider, on larger workloads than those in initial performance benchmarks. Benchmarks conducted by EnterpriseDB, which released the framework for public scrutiny on GitHub, showed PostgreSQL outperforming MongoDB at selecting, loading and inserting complex document data in key workloads involving 50 million records. This gives developers the freedom to combine structured and unstructured data in a single database with ACID compliance and relational capabilities.
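The "structured plus unstructured in one database" pattern means keeping relational columns for the fields you always query and a JSON document column for the rest. Postgres does this natively with its JSONB type; the sketch below illustrates the same idea portably with Python's `sqlite3`, filtering the JSON side in application code (table and documents are invented for illustration):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# Structured columns (id, ts) next to a free-form JSON document column.
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, ts TEXT, doc TEXT)")
docs = [
    {"user": "alice", "action": "login", "meta": {"ip": "10.0.0.1"}},
    {"user": "bob", "action": "purchase", "meta": {"amount": 42}},
]
for i, d in enumerate(docs):
    conn.execute("INSERT INTO events VALUES (?, ?, ?)",
                 (i, "2014-09-26", json.dumps(d)))
conn.commit()

# Relational predicate on the structured column, document filter on the payload.
purchases = [
    json.loads(doc)
    for (doc,) in conn.execute("SELECT doc FROM events WHERE ts = '2014-09-26'")
    if json.loads(doc)["action"] == "purchase"
]
print(purchases[0]["user"])  # bob
```

With Postgres JSONB, the document filter moves into the query itself (e.g. a `doc->>'action' = 'purchase'` predicate) and can be indexed, which is what the benchmarks exercised.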
147 comments | about 2 months ago
The recently disclosed bug in bash was bad enough as a theoretical exploit; now, reports Ars Technica, it may already be in use for real attacks. In a blog post yesterday, Robert Graham of Errata Security noted that someone is already using a massive Internet scan to locate vulnerable servers for attack. In a brief scan, he found over 3,000 servers that were vulnerable "just on port 80", the Internet Protocol port used for normal Web Hypertext Transfer Protocol (HTTP) requests. And his scan broke after a short period, meaning that there could be vast numbers of other vulnerable servers. A Google search by Ars using advanced search parameters yielded over two billion web pages that at least partially fit the profile for the Shellshock exploit. More bad news: "[T]he initial fix for the issue still left Bash vulnerable to attack, according to a new US CERT National Vulnerability Database entry." And CNET is not the only one to say that Shellshock, which can affect Macs running OS X as well as Linux and Unix systems, could be worse than Heartbleed.
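The scans work because Shellshock is trivially probed: export an environment variable whose value is a bash function definition with trailing commands, and a vulnerable bash executes the trailer merely while importing its environment. A local self-check sketch, driven from Python (assumes a `bash` binary on the system; the variable name is arbitrary):

```python
import subprocess

# Function-definition syntax plus a trailing command. Vulnerable bash runs
# the "echo VULNERABLE" while importing the variable; patched bash does not.
payload = {
    "PATH": "/usr/bin:/bin",
    "PROBE": "() { :; }; echo VULNERABLE",
}
result = subprocess.run(
    ["bash", "-c", "echo clean"],
    env=payload, capture_output=True, text=True,
)
vulnerable = "VULNERABLE" in result.stdout
print("vulnerable" if vulnerable else "patched")
```

This is the same trick remote attackers use over HTTP, since web servers copy request headers (User-Agent, Cookie, etc.) into environment variables before invoking CGI scripts via bash.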
318 comments | about 2 months ago
HughPickens.com writes Ernesto reports at TorrentFreak that despite its massive presence the Pirate Bay doesn't have a giant server park but operates from the cloud, on virtual machines that can be quickly moved if needed. The site uses 21 "virtual machines" (VMs) hosted at different providers, up four machines from two years ago, in part due to the steady increase in traffic. Eight of the VMs are used for serving the web pages, searches take up another six machines, and the site's database currently runs on two VMs. The remaining five virtual machines are used for load balancing, statistics, the proxy site on port 80, torrent storage and for the controller. In total the VMs use 182 GB of RAM and 94 CPU cores. The total storage capacity is 620 GB. One interesting aspect of The Pirate Bay is that all virtual machines are hosted with commercial cloud hosting providers, who have no clue that The Pirate Bay is among their customers. "Moving to the cloud lets TPB move from country to country, crossing borders seamlessly without downtime. All the servers don't even have to be hosted with the same provider, or even on the same continent." All traffic goes through the load balancer, which masks what the other VMs are doing. This also means that none of the IP-addresses of the cloud hosting providers are publicly linked to TPB. For now, the most vulnerable spot appears to be the site's domain. Just last year TPB burnt through five separate domain names due to takedown threats from registrars. But then again, this doesn't appear to be much of a concern for TPB as the operators have dozens of alternative domain names standing by.
144 comments | about 2 months ago
Advocatus Diaboli writes:
According to a report from Gizmodo, "After six years and over one billion dollars in development, the FBI has just announced that its new biometric facial recognition software system is finally complete. Meaning that, starting soon, photos of tens of millions of U.S. citizens' faces will be captured by the national system on a daily basis. The Next Generation Identification (NGI) program will log all of those faces, and will reference them against its growing database in the event of a crime. It's not just faces, though. Thanks to the shared database dubbed the Interstate Photo System (IPS), everything from tattoos to scars to a person's irises could be enough to secure an ID. What's more, the FBI is estimating that NGI will include as many as 52 million individual faces by next year, collecting identified faces from mug shots and some job applications." Techdirt points out that an assessment of how this system affects privacy was supposed to have preceded the actual rollout. Unfortunately, that assessment is nowhere to be found.
Two recent news items are related. First, at a music festival in Boston last year, face recognition software was tested on festival-goers. Boston police denied involvement, but were seen using the software, and much of the data was carelessly made available online. Second, both Ford and GM are working on bringing face recognition software to cars. It's intended for safety and security — it can act as authentication and to make sure the driver is paying attention to the road.
129 comments | about 2 months ago
An anonymous reader sends this news from El Reg: The U.K.'s National Health Service has ripped the Oracle backbone from a national patient database system and inserted NoSQL running on an open-source stack. Spine2 has gone live following a successful redevelopment, including redeployment on new x86 hardware. The project to replace Spine1 had been running for three years, with Spine2 now undergoing a 45-day monitoring period. Spine is the NHS's main secure patient database and messaging platform, spanning a vast estate of blades and SANs. It logs the non-clinical information of 80 million people in Britain, holding data on everything from prescriptions and payments to allergies. Spine is also a messaging hub, serving electronic communications between 20,000 applications that include the Electronic Prescription Service and Summary Care Record. It processes more than 500 complex messages a second.
198 comments | about 2 months ago