
Software Development News: .NET, Java, PHP, Ruby, Agile, Databases, SOA, JavaScript, Open Source

Methods & Tools


Database

Introducing the Microsoft Analytics Platform System – the turnkey appliance for big data analytics

At the Accelerate Your Insights event last week, Satya Nadella introduced the new Microsoft Analytics Platform System (APS) as Microsoft’s solution for delivering “Big Data in a box.” APS is an evolution of our SQL Server Parallel Data Warehouse (PDW) appliance: it builds upon the high performance and scale capabilities of that MPP version of SQL Server, and now adds a dedicated Hadoop region to the appliance alongside the SQL Server PDW capabilities. The Hadoop region within the appliance is based on the Hortonworks Data Platform for Windows, but adds key capabilities enterprises expect from a Tier 1 appliance, such as high availability through the appliance design and Windows Server failover clustering, security through Active Directory, and a unified appliance management experience through System Center. Completing the APS package, and seamlessly unifying the data in SQL Server PDW with data in Hadoop, is PolyBase, a groundbreaking query technology developed by Dr. David DeWitt and his team in Microsoft’s Gray Systems Lab.
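To make PolyBase concrete, here is a hedged T-SQL sketch of the kind of query it enables: an external table declared over files in the appliance’s Hadoop region, joined directly to a relational PDW table. The table names, columns, and HDFS location are hypothetical, and the exact external-table DDL varies by appliance version.

    -- Declare an external table over delimited files in the Hadoop region
    -- (illustrative syntax; the location and columns are placeholders).
    CREATE EXTERNAL TABLE dbo.ClickStream
    (
        user_ip    VARCHAR(50),
        url        VARCHAR(200),
        event_date DATE
    )
    WITH (
        LOCATION = 'hdfs://<namenode>:8020/data/clickstream/',
        FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
    );

    -- A single query then spans both worlds: warehouse rows joined to Hadoop rows.
    SELECT c.CustomerKey, COUNT(*) AS page_views
    FROM dbo.DimCustomer AS c
    JOIN dbo.ClickStream AS k ON k.user_ip = c.LastKnownIP
    GROUP BY c.CustomerKey;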

Microsoft continues to work with industry-leading hardware partners Dell, HP and Quanta to deliver APS as a turnkey appliance that also offers the best value in the industry for a data warehouse appliance.

Go to the APS product site to learn more, or watch the short product introduction video there.

Categories: Database

Progressive Insurance data performance grows by a factor of four, fueling business growth and online experience

At the Accelerate Your Insights event last week, Quentin Clark described how SQL Server 2014 is now part of a platform with built-in in-memory technology across all data workloads. In particular, this release adds in-memory online transaction processing (OLTP), delivering breakthrough application performance in both throughput and latency.

One of the early adopters of this technology is Progressive Insurance, a company that has long made customer service a competitive strength. Central to the customer service experience is the company’s policy-serving web app. As it updated the app, Progressive planned to add its Special Lines business, such as insurance for motorcycles, recreational vehicles, boats, and even Segway electric scooters. However, Progressive needed to know that the additional workloads wouldn’t put a damper on the customer experience.

Progressive was interested in the In-Memory OLTP capability, which can host online transaction processing (OLTP) tables and databases in a server’s working memory. The company tested In-Memory OLTP even before SQL Server 2014 became commercially available. Modifying the policy-serving app for the test was relatively straightforward, according to Craig Lanford, IT Manager at Progressive.

The company converted eight stored procedures to natively compiled versions, using already-documented code. In those tests, In-Memory OLTP boosted the processing rate from 5,000 transactions per second to 21,000, a 320 percent increase.
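For a sense of what such a conversion involves, here is a minimal T-SQL sketch of the two pieces in play: a memory-optimized table and a natively compiled stored procedure. It assumes a session-state-style workload like Progressive’s; the table, column, and procedure names are hypothetical, not taken from the case study.

    -- Memory-optimized table; the database needs a MEMORY_OPTIMIZED_DATA filegroup.
    -- SQL Server 2014 requires a BIN2 collation on string index keys and does not
    -- support LOB types in memory-optimized tables, hence the sized VARBINARY.
    CREATE TABLE dbo.SessionState
    (
        SessionId   NVARCHAR(88) COLLATE Latin1_General_100_BIN2 NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        SessionData VARBINARY(7000) NOT NULL,
        LastTouched DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    GO

    -- Natively compiled stored procedure: compiled to machine code when created,
    -- and run inside a single atomic block under snapshot isolation.
    CREATE PROCEDURE dbo.usp_GetSessionData @SessionId NVARCHAR(88)
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
        SELECT SessionData, LastTouched
        FROM dbo.SessionState
        WHERE SessionId = @SessionId;
    END;
    GO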

Lanford and his colleagues were delighted that the session-state database proved four times as fast with SQL Server 2014: “Our IT leadership team gave us the numbers we had to meet to support the increased database workload, and we far exceeded those numbers using Microsoft In-Memory OLTP.” The company will use the throughput gain to support the addition of its Special Lines business to its policy-serving app and session-state database. With SQL Server 2014, Progressive can run a single, larger database reliably and avoid the cost of multiple databases.

You can read more about how Progressive is using SQL Server 2014 here.

Whether you’ve already built a data culture in your organization or you’re new to exploring how to turn insights into action, try the latest enhancements to these technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.

Categories: Database

Boston Medical Center Supports High-Quality Care with Oracle Database Appliance

Oracle Database News - Tue, 04/22/2014 - 13:00
Boston Medical Center Supports High-Quality Care with Oracle Database Appliance -- ODA Helps Academic Medical Center Enhance Treatment and IT Operations (press release: /us/corporate/press/2191978)
Categories: Database, Vendor

MySQL April Newsletter - Latest MySQL 5.7 DMR and MySQL Workbench 6.1 GA

MySQL AB - Tue, 04/22/2014 - 01:59
Welcome to the MySQL Newsletter for April 2014. We recently announced many new product releases, including the latest MySQL 5.7 development milestone release (DMR), the general availability of MySQL Workbench 6.1, and the release candidate of MySQL Fabric. Learn more about the new features and improvements in this edition.
Categories: Database, Vendor

American Modern Insurance Group Deploys Oracle Exadata Database Machine to Support Insurance Claims Process

Oracle Database News - Mon, 04/21/2014 - 13:00
American Modern Insurance Group Deploys Oracle Exadata Database Machine to Support Insurance Claims Process -- Leading Insurance Provider Reduces Costs, Ensures High Availability and Maximizes … (press release: /us/corporate/press/2191967)
Categories: Database, Vendor

Getting Started with Microsoft Power BI for Office 365

Database Journal News - Mon, 04/21/2014 - 08:01

Power BI is a familiar, intuitive, cloud-based self-service business intelligence (BI) solution for all your data needs, right in Excel. It includes tools for data discovery, analysis and visualization, and it integrates with Office 365 for sharing, collaboration and much more.

Categories: Database

Big Data Quality Metrics

Database Journal News - Thu, 04/17/2014 - 08:01

Big data applications and their associated proprietary, high-performance data stores arrived on the scene a few years ago. With promises of incredibly fast queries, many IT shops implemented one or more of these combined hardware-and-software suites. However, few IT enterprises have implemented metrics that clearly measure the benefits of these systems, and the expected monetary gains from big data applications have not yet materialized for many companies, due to inflated expectations. The solution: measure resource usage, and use these measurements to develop quality metrics.

Categories: Database

Customers using Microsoft technologies to accelerate their insights

At yesterday’s Accelerate Your Insights event in San Francisco, we heard from CEO Satya Nadella, COO Kevin Turner and CVP Quentin Clark about how building a data culture in your company is critical to success. By combining data-driven DNA with the right analytics tools, anyone can transform data into action.

Many companies, and many of our customers, are already experiencing the power of data - taking advantage of the fastest performance for their critical apps, and revealing insights from all their data, big and small.

Since SQL Server 2014 was released to manufacturing in April, we’ve seen many stories featuring the new technical innovations in the product. In-memory transaction processing (In-Memory OLTP) speeds up an already very fast experience, delivering performance improvements of up to 30x. Korean entertainment giant CJ E&M is using In-Memory OLTP to attract more customers for its games by holding online giveaway events for digital accessories, like character costumes and decorations, soon after each game is released. When it ran tests in an actual operational environment for one of its most popular games, SQL Server 2014 delivered 35-times-faster performance than SQL Server 2012 in both batch requests per second and I/O throughput.

SQL Server 2014 also delivers enhanced data warehouse storage and query performance. NASDAQ OMX is using the In-Memory Columnstore for a system that handles billions of transactions per day and multiple petabytes of online data, with single tables holding quintillions of records of business transactions. It has seen storage reduced by 50% and some query times cut from days to minutes.

Lufthansa Systems is using the hybrid features of SQL Server 2014 to anticipate customer needs for high-availability and disaster-recovery solutions. Its pilot combining Microsoft SQL Server 2014 and Windows Azure has led to even faster and fuller data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions, compared with the company’s current solutions.

Growth in data volumes presents multiple challenges and opportunities. For executives and researchers at Oslo University Hospital, ease of access to data is important. Using Power BI for Office 365, they can analyze data in hours rather than months, collaborate with colleagues around the country, and avoid traditional BI costs. For Virginia Tech, the data deluge presents challenges for researchers in the life sciences, where gene sequencing machines generate petabytes of new kinds of unstructured data. Virginia Tech is using the power of the cloud with Microsoft Azure HDInsight to analyze data not only faster but more intelligently, work that may one day contribute to cures for cancer. For The Royal Bank of Scotland, the need to handle multiple terabytes of data and an unprecedented level of query complexity more efficiently led it to the Analytics Platform System (formerly Parallel Data Warehouse). As a result, it gained near-real-time insight into customers’ business needs as well as emerging economic trends, cut a typical four-hour query to less than 15 seconds, and simplified deployment.

Many customers are getting benefits from individual technologies, but Warner Bros. Games is using multiple BI technologies together to give executive management a true global enterprise view of its metrics. It has used SQL Server to analyze structured data from finance and sales, HDInsight to analyze large amounts of unstructured data such as social data and player trends, and SharePoint and the Power BI tools in Excel to surface the data to executive management. The organization has gained new insights that help drive new business strategies – you can watch the video here.

Whether you’ve already built a data culture in your organization, or if you’re new to exploring how you can turn insights into action, try the latest enhancements to these various technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.

Categories: Database

SQL Server 2014 and HP Set Two World Records for Data Warehousing, Leading in Both Performance and Price/Performance

Yesterday we talked about how we are delivering real-time performance to customers in every part of the platform. I’m excited to announce another example of delivering this to customers, in conjunction with one of our partners. Microsoft and Hewlett-Packard broke two world records in the TPC-H 10-terabyte and 3-terabyte benchmarks for non-clustered configurations, demonstrating superior data warehousing performance and price/performance. In each case SQL Server broke the record previously held by Oracle/SPARC, on both performance and price/performance[1], by significant margins.

10TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 404,005 queries per hour (QphH), topping the previous Oracle/SPARC record of 377,594 QphH[1]. The SQL Server configuration also shattered the price/performance metric at $2.34 USD per query-per-hour ($/QphH), beating Oracle’s $4.65/QphH[1].

3TB: Running on an HP ProLiant DL580 Gen8 server with SQL Server 2014 Enterprise Edition and Windows Server 2012 R2 Standard Edition, the configuration achieved a world-record non-clustered performance of 461,837 queries per hour (QphH), topping the previous Oracle/SPARC record of 409,721 QphH[1]. The SQL Server configuration also shattered the price/performance metric at $2.04 USD per query-per-hour ($/QphH), beating Oracle’s $3.94/QphH[1].

Breaking the world records for both performance and price/performance validates that SQL Server 2014 delivers leading in-memory performance at exceptional value, and it confirms SQL Server’s leadership in data warehousing.

The TPC Benchmark H (TPC-H) is an industry-standard decision support benchmark consisting of a suite of business-oriented ad-hoc queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance. The benchmark models decision support systems that examine large volumes of data, execute queries with a high degree of complexity, and give answers to critical business questions. The performance metric is the TPC-H Composite Query-per-Hour Rating (QphH), and the price/performance metric is the total system cost divided by that rating ($/QphH); for example, the 10TB result’s $2.34/QphH at 404,005 QphH implies a total system price of roughly $945,000. More information can be found at http://www.tpc.org/tpch/results/tpch_perf_results.asp?resulttype=noncluster

Eron Kelly,
General Manager
SQL Server

 

For more information:

 

[1] As of April 15, 2014.

SQL Server 2014 HP 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=114041502&layout=

SQL Server 2014 HP 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=114041501&layout=
Oracle 10TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113112501&layout=

Oracle 3TB TPC-H Result: http://www.tpc.org/tpch/results/tpch_result_detail.asp?id=113060701&layout=

Categories: Database

The data platform for a new era

Earlier today, Microsoft hosted a customer event in San Francisco where I joined CEO Satya Nadella and COO Kevin Turner to share our perspective on the role of data in business. Satya outlined his vision of a platform built for an era of ambient intelligence. He also stressed the importance of a “data culture” that encourages curiosity, action and experimentation – one that is supported by technology solutions that put data within reach of everyone and every organization. 

Kevin shared how customers like Beth Israel Deaconess Medical Center, Condé Nast, Edgenet, KUKA systems, NASDAQ, telent, Virginia Tech and Xerox are putting Microsoft’s platform to work and driving real business results. He highlighted an IDC study on the tremendous opportunity for organizations to realize an additional $1.6 trillion dividend over the next four years by taking a comprehensive approach to data. According to the research, businesses that pull together multiple data sources, use new types of analytics tools and push insights to more people across their organizations at the right time, stand to dramatically increase their top-line revenues, cut costs and improve productivity. 

A platform centered on people, data and analytics
In my keynote, I talked about the platform required to achieve the data culture and realize the returns on the data dividend – a platform for data, analytics and people. 

It’s people asking questions about data that is the starting point – Power BI for Office 365 and Excel’s business intelligence features help get them there. Data is key – data from all kinds of sources, including SQL Server and Azure, plus access to the world’s data from Excel. Analytics brings order and surfaces insights from broad data – analytics from SQL Server and Power BI for Office 365, and Azure HDInsight for running Hadoop in the cloud.

A platform that solves for people, data, and analytics accelerates with in-memory. We built the platform this way because customers increasingly need technology that scales with big data and accelerates their insights at the speed of modern business.

Having in-memory across the whole data platform creates speed that is revolutionary on its own, and with SQL Server we built it into the product that customers already know and have widely deployed. At the event we celebrated the launch of SQL Server 2014. With this version we now have in-memory capabilities across all data workloads, delivering breakthrough application performance in both throughput and latency. Our relational database in SQL Server has been handling data warehouse workloads at terabyte-to-petabyte scale using in-memory columnar data management. With the release of SQL Server 2014, we have added in-memory online transaction processing. And in-memory technology has been allowing users to manipulate millions of records at the speed of thought, scaling analytics solutions to billions of records in SQL Server Analysis Services.

The platform for people, data and analytics needs to be where the data and the people are. Our on-premises and cloud solutions provide endpoints along a continuum of how businesses actually manage data and experiences, making hybrid a part of every customer’s capability. Today we announced that our Analytics Platform System is generally available – the evolution of the Parallel Data Warehouse product that now supports querying across the traditional relational data warehouse and data stored in a Hadoop region, either in the appliance or in a separate Hadoop cluster. SQL Server has seamless integration with VMs in Azure to provide secondaries for high availability and disaster recovery. And the data people access in the business intelligence experience comes through Excel, from their own data and partner data – Power BI provides access to that data wherever it resides.

The platform for people, data and analytics needs to have full reach. The natural language query feature, Q&A, in Power BI for Office 365 is significant in that it provides data insights to anyone who is curious enough to ask a question. We have changed who is able to reach insights by not demanding that everyone learn the vernacular of schemas and chart types. With SQL Server, the most widely deployed database on the planet, many people already have the skills to take advantage of all the capabilities of the platform. And with a billion people who know how to use Excel, people have the skills to get engaged with the data.

Looking forward, we will be very busy. Satya mentioned some work we are doing in the machine learning space, and today we also announced a preview of the Intelligent Systems Service – just a couple of the things we are working on to deliver a platform for the era of ambient intelligence. The machine learning work originates in what it takes to run services at Microsoft like Bing. We had to transform ML from a deep vertical domain into an engineering capability, and in doing so we learned what it would take to democratize ML for our customers. Stay tuned.

The Internet of Things (IoT) space is very clearly one of the most important trends in data today. Not only do we envision the data from IoT solutions being well served by the data platform, but we need to ensure the end-to-end solution can be realized by any customer. To that end, Intelligent Systems Service (ISS) is an Internet of Things offering built on Azure, which makes it easier to securely connect, manage, capture and transform machine-generated data regardless of the operating system platform.

It takes a data platform built for the era of ambient intelligence, spanning data, analytics and people, to let companies get the most value from their data and realize a data culture. I believe Microsoft is uniquely positioned to provide this platform – through the speed of in-memory, our cloud and our reach. Built on the world’s most widely deployed database, connected to the cloud through Azure, delivering insights to billions through Office and understanding the world through our new IoT service, it is truly a data platform for a new era. When you put it all together, only Microsoft brings that comprehensive a platform and that much value to our customers.

 

Quentin Clark
Corporate Vice President
Data Platform Group

Categories: Database

Tune in tomorrow and accelerate your insights

Tomorrow’s the day! Tune in to hear from Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group CVP Quentin Clark about Microsoft’s approach to data, and how the latest advancements in technology can help you transform data into action.

Who should watch?

Join us tomorrow morning at 10AM PDT if you like data or want to learn more about it. If you store it, manage it, explore it, slice and dice it, analyze it, visualize it, present it, or make decisions based on it. If you’re architecting data solutions or deciding on the best data technology for your business. If you’re a DBA, business analyst, data scientist, or even just a data geek on the side, join the live stream.

What will I hear about?

Data infrastructure. Data tools. And ultimately, the power of data. From finding the connections that could cure cancer, to predicting the success of advertising campaigns, data can do incredible things. Join us online and get inspired. You’ll see how your peers are putting their data, big and small, to work.

From a product perspective, we’ll celebrate the latest advancements in SQL Server 2014, Power BI for Office 365, SQL Server Parallel Data Warehouse, and Microsoft Azure HDInsight. And ultimately, we’ll explore how these offerings can help you organize, analyze, and make sense of your data – no matter the size, type, or location.

Where do I sign up?

Mark your calendar now or RSVP on Facebook so you’re ready to go tomorrow. When streaming goes live, you can join us here for all the action live from San Francisco.

When do things get started?

Tomorrow, April 15, at 10AM PDT. Be there.

See you tomorrow!

Categories: Database

Enhance Your MySQL XML Import Procedures using Prepared Statements

Database Journal News - Mon, 04/14/2014 - 08:01

In the Importing XML Data into MySQL Tables Using a Stored Procedure article, Rob Gravelle outlined some ways to work around MySQL's restrictions on stored procedures to import XML data into your MySQL database tables. Today's article covers how to use a Prepared Statement, including error handling and validation, as well as handling additional XML formats.
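As a hedged illustration of the core technique (the procedure, table, and column names below are invented for this sketch, not taken from the article), a prepared statement lets a stored procedure insert values extracted from an XML document into a table whose name is only known at run time:

    -- Build the INSERT dynamically, because a stored procedure cannot use a
    -- parameter directly as a table identifier, then run it as a prepared statement.
    DELIMITER //
    CREATE PROCEDURE import_xml_row(IN tbl VARCHAR(64), IN xml_doc TEXT, IN i INT)
    BEGIN
        SET @sql   = CONCAT('INSERT INTO ', tbl, ' (name, email) VALUES (?, ?)');
        SET @name  = ExtractValue(xml_doc, CONCAT('//row[', i, ']/name'));
        SET @email = ExtractValue(xml_doc, CONCAT('//row[', i, ']/email'));
        PREPARE stmt FROM @sql;
        EXECUTE stmt USING @name, @email;   -- parameters keep the values safely quoted
        DEALLOCATE PREPARE stmt;
    END//
    DELIMITER ;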

Categories: Database

Database .NET 11 released!

PostgreSQL News - Mon, 04/14/2014 - 01:00

Database .NET v11 is an innovative, powerful and intuitive multi-database management tool. With it you can browse objects, design tables, edit rows, export data and run queries with a consistent interface. It is free, all-in-one, portable, standalone (no installation required) and multilanguage.

New features from version 10.1 to 11.0:
  • Compatible with the latest versions of PostgreSQL
  • Updated to Npgsql.dll 2.1.3
  • Added Support for JSON data type of PostgreSQL
  • Added Support for renaming all objects of PostgreSQL
  • Added Executing SQL Statements from a text file
  • Added Displaying Row Count of tables
  • Added Displaying Connection time
  • Added Data Editor for View objects
  • Added Search Table Data
  • Added Empty Table
  • Added New Connection Manager
  • Added New Object Navigator
  • Added Script All Table Data (INSERTs)
  • Added IntelliSense for Cross-schema table access
  • Added Selected Text to Query Builder
  • Added Dynamic Context Menu
The new version is immediately available for download.
Categories: Database, Open Source

Barman 1.3.1 released

PostgreSQL News - Mon, 04/14/2014 - 01:00

14 April 2014: 2ndQuadrant is proud to announce the release of version 1.3.1 of Barman, Backup and Recovery Manager for PostgreSQL.

This minor release introduces support for concurrent backup using physical file-based copy through rsync, in conjunction with pgespresso, a new open source extension available for PostgreSQL 9.2 and 9.3. Concurrent backup allows database administrators who rely on Barman to offload backup operations to a streaming-replication standby server, opening important new scenarios in disaster recovery architectures for PostgreSQL 9.2+ database servers.

The "barman diagnose" command has been implemented to print important information about the system and the configuration of Barman, allowing users to provide detailed diagnostics data in case of support requests.

Version 1.3.1 fixes an important recovery bug that affected only users with tablespaces created inside the PGDATA directory; the bug was introduced in version 1.3.0.

Minor bugs have also been fixed.

Many thanks for funding towards the development of this release go to Adyen (www.adyen.com).

For a complete list of changes, see the "Release Notes" section below.

Release notes

  • Added support for concurrent backup of PostgreSQL 9.2 and 9.3 servers that use the "pgespresso" extension. This feature is controlled by the "backup_options" configuration option (global/server) and activated when set to "concurrent_backup". Concurrent backup allows DBAs to perform full backup operations from a streaming replicated standby.
  • Added the "barman diagnose" command which prints important information about the Barman system (extremely useful for support and problem solving)
  • Improved error messages and exception handling interface
  • Fixed bug in recovery of tablespaces that are created inside the PGDATA directory (bug introduced in version 1.3.0)
  • Fixed minor bug of unhandled -q option, for quiet mode of commands to be used in cron jobs (bug introduced in version 1.3.0)
  • Minor bug fixes and code refactoring

About Barman

Barman (Backup and Recovery Manager) is an open source administration tool for disaster recovery of PostgreSQL servers, written in Python. It allows your organisation to perform remote backups of multiple servers in business-critical environments and helps DBAs during the recovery phase. Barman’s most requested features include backup catalogues, retention policies, remote backup and recovery, and archiving and compression of WAL files and backups. Barman is distributed under the GNU GPL 3.

Categories: Database, Open Source

Create a business intelligence and analytics service in Ruby with BLU Acceleration on BlueMix

IBM - DB2 and Informix Articles - Fri, 04/11/2014 - 05:00
The BLU Acceleration Service, available in IBM Codename: BlueMix, provides a powerful, easy-to-use, and agile platform for business intelligence and analytics. It is an enterprise-class managed service powered by the in-memory-optimized, column-organized BLU Acceleration data warehouse technology. This article demonstrates how easy it is to incorporate the BLU Acceleration service into your application, so that you can focus on the application itself.
Categories: Database

Oracle 11.2 Outer Join And Index Issue

Database Journal News - Thu, 04/10/2014 - 08:01

With a product as complex as Oracle, some bugs are bound to be present. Some of these bugs are show-stoppers and others aren't, but they do teach you to pay careful attention to the results a query delivers. Even when queries are syntactically and logically correct, you can't be certain that Oracle won't do something 'behind the scenes' that produces the wrong answer.

Categories: Database

Postgres Open 2014 - Opens the Call for Papers

PostgreSQL News - Thu, 04/10/2014 - 01:00

Postgres Open 2014 will be held in Chicago, IL, at the Hotel Sax, September 17 - 19, 2014. It will feature two full days of parallel tracks of PostgreSQL presentations (September 18 - 19) from both local and global speakers, covering a wide range of topics. In addition, we will offer a separate day of tutorials (Wednesday, September 17). For more information about the conference, please see our website: http://postgresopen.org/2014

The Program Committee is currently accepting proposals for presentations at the conference. We are interested in submissions from both seasoned PostgreSQL experts and people new to the community, from both locals and representatives of the global community - in short, from anybody who has an interesting story to tell about PostgreSQL, whether deeply technical or a story about a successful (or failed) deployment. All presentations are 45 minutes, with time for questions. Talks can be submitted via the website: http://postgresopen.org/2014/callforpapers/

Early-bird ticket registration will open on May 19, 2014 and run through June 30, 2014, after which tickets will go up to their regular price.

Finally, we are also looking for sponsors! We offer several tiers of sponsorship, to make sure there is a choice for everybody. If you are interested, please see the Postgres Open sponsor page: http://postgresopen.org/2014/becomesponsor/

We look forward to seeing you in Chicago in September!

Categories: Database, Open Source

SQL Server 2014 Launch – A Community Affair

Guest blog post by PASS President Thomas LaRock. Thomas – a SQL Server MVP, MCM, and Head Geek at SolarWinds – is a seasoned IT professional with over a decade of technical and management experience. Author of DBA Survivor: Become a Rock Star DBA, he holds an MS degree in Mathematics from Washington State University and is a Microsoft Certified Trainer and a VMware vExpert. You can read his blog at thomaslarock.com and follow him on Twitter at @SQLRockstar.

*     *     *     *     *

April opened with the general availability of SQL Server 2014. But well before we could wrap our hands around the final bits of the new release, the SQL Server community had been getting an early taste of its exciting performance, availability, manageability, and cloud features, thanks to a grassroots launch and readiness program that has spread around the globe.

The Professional Association for SQL Server (PASS) and hundreds of our volunteers around the world have joined with Microsoft to host free SQL Server 2014 launch events and technical sessions that focus on what matters most to data pros. These sessions explain the new features in the release, how the features work, and how we can use them to benefit our companies.

From user group meetings and PASS SQLSaturday sessions to the ongoing SQL Server 2014 Countdown webinars with PASS Virtual Chapters, the launch of SQL Server 2014 has truly been a community affair – and we're just getting started. Whether you're already on the path to early adoption, preparing to take advantage of the new release soon, or gathering information for the future, here's how you can get involved and get the details you need to make smart decisions for your organization:

  • Connect with fellow SQL Server pros: Microsoft Data Platform Group GM Eron Kelly noted that for Community Technology Preview 2, there were nearly 200K evaluations of SQL Server 2014, including 20K evaluations with the new release running in a Microsoft Azure Virtual Machine. That's a lot of folks who now have first-hand knowledge about SQL Server 2014. Check out those who are blogging, speaking about their experiences, and sharing at chapter meetings, and then get to know them and what they know.
  • Share your questions, issues, and solutions: Have you tried out SQL Server's new built-in in-memory OLTP features? How about the enhanced mission-critical and availability capabilities? Have questions about implementing a hybrid data solution that bridges on-premises and cloud technologies? And how and when should you use the new delayed durability setting or clustered columnstore indexes? Share your experiences – and what you don't know or need more information about – and help the community build up resources that enable us all to work better, smarter, and faster.
  • Learn how to get the most from your data: Go inside the new release with experts on the SQL Server product team at upcoming live SQL Server 2014 Countdown webinars and watch on-demand replays of those you missed. You can also learn more about SQL Server 2014 and Microsoft's data platform strategy at the Accelerate Your Insights online launch event April 15 with Microsoft CEO Satya Nadella, COO Kevin Turner, and Data Platform Group Corporate Vice President Quentin Clark. And remember to check with your local PASS chapter, Virtual Chapter, or nearby SQLSaturday event for more SQL Server 2014 launch and learning events happening worldwide.

I'm grateful to be part of one of the most passionate technology communities in the world and excited to participate in a SQL Server 2014 launch program that, at its core, is about empowering SQL Server professionals and their organizations to be successful.

Thanks to everyone who is helping connect, share, and learn about SQL Server 2014.
Thomas

Categories: Database

MongoDB 2.6 Improves Open-Source Database Performance

Database Journal News - Tue, 04/08/2014 - 18:06

Index Intersection, text search and performance gains land in MongoDB 2.6 release

Categories: Database

SQL Server 2014 brings on-premises and cloud database together to improve data availability and disaster recovery

With the recently announced general availability of SQL Server 2014, Microsoft brings to market new hybrid scenarios, enabling customers to take advantage of Microsoft Azure in conjunction with on-premises SQL Server.

SQL Server 2014 helps customers protect their data and make it more highly available using Azure. SQL Server Backup to Microsoft Azure builds on functionality first introduced in SQL Server 2012, adding a UI for easily configuring backup to Azure from SQL Server Management Studio (SSMS). Backups are encrypted and compressed, enabling fast and secure cloud backup storage. Setup requires only Azure credentials and an Azure storage account. For help getting started, this step-by-step guide will get you going with the easy, three-step process.
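As a minimal sketch of the underlying T-SQL (the storage account, container, credential, and database names are placeholders), backup to Azure pairs a credential holding the storage account key with the TO URL clause:

    -- Credential holding the Azure storage account name and access key.
    CREATE CREDENTIAL AzureBackupCred
    WITH IDENTITY = 'mystorageaccount',      -- storage account name
         SECRET = '<storage access key>';

    -- Back up straight to a blob in the storage container.
    BACKUP DATABASE SalesDb
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDb.bak'
    WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION, FORMAT;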

Storing backup data in Azure is cost-effective, secure, and inherently offsite, making it a useful component in business continuity planning. A March 2014 study on cloud backup and disaster recovery, commissioned by Microsoft and conducted by Forrester Consulting, found that saving money on storage is the top benefit of cloud database backup, cited by 61% of respondents, followed closely by the 50% who cited savings on administrative costs. Backups stored in Azure also benefit from Azure's built-in geo-redundancy and high service levels, and can be restored to an Azure VM for fast recovery from onsite outages.

In addition to the SQL Server 2014 functionality for backing up to Azure, we have now made generally available a free standalone SQL Server Backup to Microsoft Azure Tool that can encrypt and compress backup files for all supported versions of SQL Server and store them in Azure, enabling a consistent backup-to-cloud strategy across your SQL Server environments. This fast, easy-to-configure tool lets you quickly create rules that direct a set of backups to Azure rather than local storage, as well as select encryption and compression settings.

Another new business continuity scenario enabled by SQL Server 2014 is disaster recovery (DR) in the cloud. Customers can now set up an asynchronous replica in Azure as part of an AlwaysOn high availability solution. A new SSMS wizard simplifies the deployment of replicas on-premises and to Azure. As soon as a transaction is committed on-premises, it is sent asynchronously to the cloud replica. We still recommend keeping your synchronous replica on-premises, but by adding replicas in Azure you gain improved DR and can reduce the CAPEX and OPEX costs of physically maintaining additional hardware in additional data centers.

Another benefit of keeping an asynchronous replica in Azure is that the replica can be used efficiently for read workloads such as BI reporting, or for taking backups, which speeds up the backup-to-Azure process since the secondary is already in Azure.
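Here is a hedged T-SQL sketch of roughly what the wizard automates; the availability group, server, and endpoint names are placeholders:

    -- Add an asynchronous, readable secondary hosted in an Azure VM
    -- to an existing availability group.
    ALTER AVAILABILITY GROUP [SalesAG]
    ADD REPLICA ON N'AZUREVM01'
    WITH (
        ENDPOINT_URL = N'TCP://azurevm01.cloudapp.net:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,        -- commits locally, ships to Azure
        FAILOVER_MODE = MANUAL,                         -- DR failover is a deliberate act
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY)  -- usable for reporting and backups
    );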

But the greatest value to customers of an AlwaysOn replica in Azure is the speed to recovery. Customers are finding that their recovery point objectives (RPO) can be reduced to limit data loss, and their recovery time objectives (RTO) can be measured in seconds:

  • Lufthansa Systems is a full-spectrum IT consulting and services organization that serves airlines, financial services firms, healthcare systems, and many more businesses. To better anticipate customer needs for high-availability and disaster-recovery solutions, Lufthansa Systems piloted a solution on SQL Server 2014 and Azure that led to faster and more robust data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions. They expect to deploy the solution on a rolling basis starting in 2014.
  • Amway, a global direct seller, conducted a pilot test of AlwaysOn Availability Groups for high availability and disaster recovery. With multisite data clustering and failover to databases hosted both on-premises and in Azure, Amway found that the test of SQL Server AlwaysOn with Azure replicas delivered 100 percent uptime, with failover taking place in 10 seconds or less. The company is now planning how best to deploy the solution.

Finally, SQL Server 2014 lets you move your database files to Azure while keeping your applications on-premises, for bottomless storage in the cloud and greater availability. The SQL Server Data Files in Microsoft Azure configuration also provides an alternative storage location for archival data, with cost-effective storage and easy access.
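A minimal sketch of that configuration, assuming a storage container secured with a Shared Access Signature (the account, container, and database names are placeholders):

    -- The credential is named after the container URL and holds a SAS token.
    CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/data]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET = '<SAS token>';

    -- The database files live in Azure blob storage; the instance stays on-premises.
    CREATE DATABASE ArchiveDb
    ON (NAME = ArchiveDb_data,
        FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/ArchiveDb.mdf')
    LOG ON (NAME = ArchiveDb_log,
        FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/ArchiveDb.ldf');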

If you're ready to evaluate how SQL Server 2014 can benefit your database environment, download a trial here. For greater flexibility deploying SQL Server on-premises and in the cloud, sign up for a free Azure evaluation. And, to get started backing up older versions of SQL Server to Azure, try our free standalone backup tool. Also, don't forget to save the date for the live stream of our April 15 Accelerate Your Insights event to hear more about our data platform strategy from CEO Satya Nadella, COO Kevin Turner and CVP of Data Platform Quentin Clark.

Categories: Database