Subscribe to Methods & Tools
if you are not afraid to read more than one page to be a smarter software developer, software tester or project manager!
Big data applications and their associated proprietary, high-performance data stores arrived on the scene a few years ago. With promises of incredibly fast queries, many IT shops implemented one or more of these combination hardware and software suites. However, few IT enterprises have implemented metrics that clearly measure the benefits of these systems. The expected monetary gains from big data applications have not yet materialized for many companies, due to inflated expectations. The solution: Measure resource usage, and use these measurements to develop quality metrics.
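As a loose illustration of that advice, a simple quality metric such as cost per query can be derived from resource-usage measurements. The sketch below is hypothetical: the sampled figures and the assumed CPU-hour cost are made-up numbers, not data from any real system.

```python
# Hypothetical resource-usage samples: (queries served, CPU-hours used) per day.
measurements = [
    (12_000, 40.0),
    (15_500, 48.5),
    (9_800, 33.2),
]
cpu_hour_cost = 1.75  # assumed fully loaded cost of one CPU-hour, in USD

total_queries = sum(queries for queries, _ in measurements)
total_cost = sum(hours for _, hours in measurements) * cpu_hour_cost

# Cost per query: one candidate quality metric to track over time.
cost_per_query = total_cost / total_queries
print(f"Cost per query: ${cost_per_query:.4f}")
```

Tracked over time, a metric like this makes it possible to tell whether an expensive data store is actually earning its keep.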
At yesterday’s Accelerate your insights event in San Francisco, we heard from CEO Satya Nadella, COO Kevin Turner and CVP Quentin Clark about how building a data culture in your company is critical to success. By combining data-driven DNA with the right analytics tools, anyone can transform data into action.
Many companies, and many of our customers, are already experiencing the power of data - taking advantage of the fastest performance for their critical apps, and revealing insights from all their data, big and small.
Since SQL Server 2014 was released to manufacturing in April, we’ve seen many stories featuring the new technical innovations in the product. In-Memory OLTP (in-memory transaction processing) speeds up an already very fast engine, typically delivering performance improvements of up to 30x. Korean entertainment giant CJ E&M is using In-Memory OLTP to attract more customers for its games by holding online giveaway events for digital accessories, such as character costumes and decorations, soon after each game is released. When it ran tests in an actual operational environment for one of its most popular games, SQL Server 2014 delivered 35-times-faster performance than the 2012 version in both batch requests per second and I/O throughput.
SQL Server 2014 also enhances data warehouse storage and query performance. NASDAQ OMX is using the In-Memory Columnstore for a system that handles billions of transactions per day and multiple petabytes of online data, with single tables holding quintillions of records of business transactions. It has seen storage reduced by 50% and some query times cut from days to minutes.
Lufthansa Systems is using the hybrid features of SQL Server 2014 to anticipate customer needs for high-availability and disaster-recovery solutions. Its pilot combining Microsoft SQL Server 2014 and Windows Azure has led to even faster and fuller data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions compared with the company’s current solutions.
Growth in data volumes presents multiple challenges and opportunities. For executives and researchers at Oslo University Hospital, ease of access to data is important. Using Power BI for Office 365, they can analyze data in hours rather than months, collaborate with colleagues around the country, and avoid traditional BI costs. For Virginia Tech, the data deluge presents challenges for researchers in the life sciences, where new kinds of unstructured data from gene-sequencing machines are generating petabytes of data. They are using the power of the cloud with Microsoft Azure HDInsight to analyze data not only faster but also more intelligently, work that may in the future contribute to cures for cancer. The Royal Bank of Scotland needed to handle multiple terabytes of data and an unprecedented level of query complexity more efficiently, which led it to the Analytics Platform System (formerly Parallel Data Warehouse). As a result, it gained near-real-time insight into customers’ business needs and emerging economic trends, cut a typical four-hour query to less than 15 seconds, and simplified deployment.
Many customers are getting benefits from individual technologies, but Warner Brothers Games is using multiple BI technologies together to give executive management a true global enterprise view of its metrics. It has used SQL Server to analyze structured data from finance and sales, HDInsight to analyze large amounts of unstructured data such as social data and player trends, and SharePoint with the Power BI tools in Excel to surface the data to executive management. The organization has gained new insights that help drive new business strategies.
Whether you’ve already built a data culture in your organization, or if you’re new to exploring how you can turn insights into action, try the latest enhancements to these various technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.
Yesterday we talked about how we are delivering real-time performance to customers in every part of the platform. I’m excited to announce another example of this, delivered in conjunction with one of our partners. Microsoft and Hewlett Packard broke two world records in the TPC-H 10-terabyte and 3-terabyte benchmarks for non-clustered configurations, demonstrating superior data warehousing performance and price/performance. In both cases SQL Server broke the record previously held by Oracle/SPARC, on both performance and price/performance, by significant margins.
10TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 404,005 queries per hour (QphH), topping the previous Oracle/SPARC record of 377,594 QphH. The SQL Server configuration also shattered the price/performance metric at US $2.34 per query per hour ($/QphH), beating Oracle’s $4.65/QphH.
3TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 461,837 queries per hour (QphH), topping the previous Oracle/SPARC record of 409,721 QphH. The SQL Server configuration also shattered the price/performance metric at US $2.04 per query per hour ($/QphH), beating Oracle’s $3.94/QphH.
Breaking the world records for both performance and price/performance validates that SQL Server 2014 delivers leading in-memory performance at exceptional value. It also confirms SQL Server’s leadership in data warehousing.
The TPC Benchmark™H (TPC-H) is an industry-standard decision support benchmark that consists of a suite of business-oriented ad-hoc queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance. The benchmark models decision support systems that examine large volumes of data, execute queries with a high degree of complexity, and give answers to critical business questions. The performance metric is the TPC-H Composite Query-per-Hour rating (QphH), and the price/performance metric is the total system cost divided by that rating ($/QphH). More information can be found at http://www.tpc.org/tpch/results/tpch_perf_results.asp?resulttype=noncluster
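The margins follow directly from the published 10TB numbers above; a quick calculation makes them concrete:

```python
# Published TPC-H 10TB non-clustered results quoted in this announcement.
sql_server_qphh = 404_005   # SQL Server 2014 on HP ProLiant DL580 Gen8
oracle_qphh = 377_594       # previous Oracle/SPARC record

sql_server_price_perf = 2.34  # $/QphH
oracle_price_perf = 4.65      # $/QphH

# Throughput margin: relative improvement over the previous record.
perf_gain = (sql_server_qphh - oracle_qphh) / oracle_qphh
print(f"Throughput improvement: {perf_gain:.1%}")

# Price/performance margin: lower $/QphH is better, so measure the reduction.
price_gain = (oracle_price_perf - sql_server_price_perf) / oracle_price_perf
print(f"Price/performance improvement: {price_gain:.1%}")
```

The throughput record was beaten by roughly 7%, while the cost per query-per-hour was roughly halved.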
For more information:
SQL Server 2014 HP 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=114041502&layout=
SQL Server 2014 HP 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=114041501&layout=
Oracle 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113112501&layout=
Oracle 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113060701&layout=
Earlier today, Microsoft hosted a customer event in San Francisco where I joined CEO Satya Nadella and COO Kevin Turner to share our perspective on the role of data in business. Satya outlined his vision of a platform built for an era of ambient intelligence. He also stressed the importance of a “data culture” that encourages curiosity, action and experimentation – one that is supported by technology solutions that put data within reach of everyone and every organization.
Kevin shared how customers like Beth Israel Deaconess Medical Center, Condé Nast, Edgenet, KUKA systems, NASDAQ, telent, Virginia Tech and Xerox are putting Microsoft’s platform to work and driving real business results. He highlighted an IDC study on the tremendous opportunity for organizations to realize an additional $1.6 trillion dividend over the next four years by taking a comprehensive approach to data. According to the research, businesses that pull together multiple data sources, use new types of analytics tools and push insights to more people across their organizations at the right time, stand to dramatically increase their top-line revenues, cut costs and improve productivity.
A platform centered on people, data and analytics
In my keynote, I talked about the platform required to achieve the data culture and realize the returns on the data dividend – a platform for data, analytics and people.
People asking questions about data is the starting point, and Power BI for Office 365 and Excel’s business intelligence features help get them there. Data is key: data from all kinds of sources, including SQL Server and Azure, with the world’s data accessible from Excel. Analytics brings order and surfaces insights from broad data: analytics from SQL Server and Power BI for Office 365, and Azure HDInsight for running Hadoop in the cloud.
A platform that solves for people, data, and analytics accelerates with in-memory technology. We created the platform because customers increasingly need technology that scales with big data and accelerates their insights at the speed of modern business.
Having in-memory across the whole data platform creates speed that is revolutionary on its own, and with SQL Server we built it into the product that customers already know and have widely deployed. At the event we celebrated the launch of SQL Server 2014. With this version we now have in-memory capabilities across all data workloads delivering breakthrough performance for applications in throughput and latency. Our relational database in SQL Server has been handling data warehouse workloads in the terabytes to petabyte scale using in-memory columnar data management. With the release of SQL Server 2014, we have added in-memory Online Transaction Processing. In-memory technology has been allowing users to manipulate millions of records at the speed of thought, and scaling analytics solutions to billions of records in SQL Server Analysis Services.
The platform for people, data and analytics needs to be where the data and the people are. Our on-premises and cloud solutions provide endpoints for a continuum of how the realities of business manage data and experiences – making hybrid a part of every customer’s capability. Today we announced that our Analytics Platform System is generally available – this is the evolution of the Parallel Data Warehouse product that now supports the ability to query across the traditional relational data warehouse and data stored in a Hadoop region – either in the appliance or in a separate Hadoop cluster. SQL Server has seamless integration with VMs in Azure to provide secondaries for high availability and disaster recovery. The data people access in the business intelligence experience comes through Excel from their own data and partner data – and Power BI provides accessibility to wherever the data resides.
The platform for people, data and analytics needs to have full reach. The natural-language Q&A feature in Power BI for Office 365 is significant in that it provides data insights to anyone who is curious enough to ask a question. We have changed who is able to reach insights by not demanding that everyone learn the vernacular of schemas and chart types. With SQL Server, the most widely deployed database on the planet, many people already have the skills to take advantage of the platform’s full capabilities. And with a billion people who know how to use Excel, people have the skills to engage with the data.
Looking forward, we will be very busy. Satya mentioned some work we are doing in the Machine Learning space, and today we also announced a preview of Intelligent Systems Service – just a couple of the things we are working on to deliver a platform for the era of ambient intelligence. The Machine Learning work originates in what it takes to run services at Microsoft like Bing. We had to transform ML from a deep vertical domain into an engineering capability, and in doing so learned what it would take to democratize ML for our customers. Stay tuned.
The Internet of Things (IoT) space is very clearly one of the most important trends in data today. Not only do we envision the data from IoT solutions being well served by the data platform, but we need to ensure the end-to-end solution can be realized by any customer. To that end, Intelligent Systems Service (ISS) is an Internet of Things offering built on Azure, which makes it easier to securely connect, manage, capture and transform machine-generated data regardless of the operating system platform.
It takes a data platform built for the era of ambient intelligence with data, analytics and people to let companies get the most value from their data and realize a data culture. I believe Microsoft is uniquely positioned to provide this platform – through the speed of in-memory, our cloud and our reach. Built on the world’s most widely-deployed database, connected to the cloud through Azure, delivering insights to billions through Office and understanding the world through our new IoT service – it is truly a data platform for a new era. When you put it all together only Microsoft is bringing that comprehensive a platform and that much value to our customers.
Corporate Vice President
Data Platform Group
Tomorrow’s the day! Tune in to hear from Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group CVP Quentin Clark about Microsoft’s approach to data, and how the latest advancements in technology can help you transform data into action.
Who should watch?
Join us tomorrow morning at 10AM PDT if you like data or want to learn more about it. If you store it, you manage it, you explore it, you slice and dice it, you analyze it, you visualize it, you present it, or if you make decisions based on it. If you’re architecting data solutions or deciding on the best data technology for your business. If you’re a DBA, business analyst, data scientist, or even just a data geek on the side, join the live stream.
What will I hear about?
Data infrastructure. Data tools. And ultimately, the power of data. From finding the connections that could cure cancer, to predicting the success of advertising campaigns, data can do incredible things. Join us online and get inspired. You’ll see how your peers are putting their data, big and small, to work.
From a product perspective, we’ll celebrate the latest advancements in SQL Server 2014, Power BI for Office 365, SQL Server Parallel Data Warehouse, and Microsoft Azure HDInsight. And ultimately, we’ll explore how these offerings can help you organize, analyze, and make sense of your data – no matter the size, type, or location.
Where do I sign up?
Tomorrow, April 15, at 10AM PDT. Be there.
See you tomorrow!
In the Importing XML Data into MySQL Tables Using a Stored Procedure article, Rob Gravelle outlined some ways to work around MySQL's restrictions on stored procedures to import XML data into your MySQL database tables. Today's article covers how to use a Prepared Statement, including error handling and validation, as well as handling additional XML formats.
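The core pattern behind the article can be illustrated in miniature. The sketch below is a hypothetical example, not the article’s own code: it uses Python’s built-in sqlite3 module as a stand-in for MySQL, with a made-up `clients` table and XML layout. It shows the essential steps — validate each XML record, then insert it through a parameterized (prepared-statement-style) query:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML payload; the third record is invalid (empty required field).
xml_doc = """
<clients>
  <client><name>Acme</name><city>Toronto</city></client>
  <client><name>Globex</name><city>Ottawa</city></client>
  <client><name></name><city>Unknown</city></client>
</clients>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clients (name TEXT NOT NULL, city TEXT)")

inserted, skipped = 0, 0
for client in ET.fromstring(xml_doc):
    name = (client.findtext("name") or "").strip()
    city = (client.findtext("city") or "").strip() or None
    if not name:
        # Validation: skip records missing a required field instead of failing.
        skipped += 1
        continue
    # The ? placeholders keep data separate from the SQL text, the same
    # protection a MySQL prepared statement provides.
    conn.execute("INSERT INTO clients (name, city) VALUES (?, ?)", (name, city))
    inserted += 1
conn.commit()

print(inserted, skipped)  # 2 1
```

In MySQL the equivalent would use `PREPARE`/`EXECUTE` inside the stored procedure, or placeholder binding through the client library, but the validate-then-bind flow is the same.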
Database .NET v11 is an innovative, powerful and intuitive multiple-database management tool. With it you can browse objects, design tables, edit rows, export data and run queries with a consistent interface. It is free, all-in-one, portable, standalone (no installation required) and multilanguage. New features from version 10.1 to 11.0:
14 April 2014: 2ndQuadrant is proud to announce the release of version 1.3.1 of Barman, Backup and Recovery Manager for PostgreSQL.
This minor release introduces support for concurrent backup using physical file-based copy through "rsync", in conjunction with pgespresso, a new open-source extension available for PostgreSQL 9.2 and 9.3. Concurrent backup allows database administrators who rely on Barman to offload backup operations to a streaming-replicated standby server, opening important new scenarios in disaster recovery architectures for PostgreSQL 9.2+ database servers.
The "barman diagnose" command has been implemented to print important information about the system and the configuration of Barman, allowing users to provide detailed diagnostics data in case of support requests.
Version 1.3.1 fixes an important bug on recovery that was affecting only those users having tablespaces created inside the PGDATA directory. This behaviour was introduced in version 1.3.0.
Minor bugs have also been fixed.
Many thanks for funding towards the development of this release go to Adyen (www.adyen.com).
For a complete list of changes, see the "Release Notes" section below.
Barman (Backup and Recovery Manager) is an open-source administration tool for disaster recovery of PostgreSQL servers, written in Python. It allows your organisation to perform remote backups of multiple servers in business-critical environments and helps DBAs during the recovery phase. Barman's most requested features include backup catalogues, retention policies, remote backup and recovery, and archiving and compression of WAL files and backups. Barman is distributed under the GNU GPL 3.
With a product as complex as Oracle, some bugs are bound to be present. Some of these bugs are show-stoppers and others aren't, but they do teach you to pay careful attention to the results a query delivers. Even when queries are syntactically and logically correct, you can't be certain that Oracle won't do something 'behind the scenes' that produces the wrong answer.