While using a custom SSIS-based project to upload the content of an arbitrarily chosen table from an on-premises SQL Server database to an Azure PaaS-based SQL Database provides the most flexibility, its inherent complexity might not be justified in simpler scenarios. Marcin Policht presents an alternative approach that addresses this concern by employing the bcp utility, which not only eliminates the need for any custom development but also rivals SSIS in terms of data transfer efficiency.
More than 900 big data practitioners, technologists and executives will gather at Structure Data in New York March 19-20 to investigate how big data can drive business success, and Microsoft will be there to take part in the conversation.
Around the world, big data and big compute are coming together to help enterprises and organizations create better products, make cities run smoother, advance disease prevention and treatments, and much more.
Today, Microsoft is working with customers to do just this. In the process, we’ll bring the power of big data to a billion people – connecting companies to previously untouched data sources and enabling everyone to gain insights through familiar and powerful tools that now simply live in Microsoft Excel.
At the event, John Platt, distinguished scientist at Microsoft Research, will share the latest on Microsoft’s work in machine learning in a fireside chat. We’ll also have data experts on hand to give attendees a taste of how easy it is to turn big data into beautiful visualizations and insights. Attendees will also get a peek at how Microsoft’s Cybercrime Center is using data to fight worldwide organized crime and botnets. (Get a sneak peek by watching the video below!)
So please, join the conversation by attending the event and visiting with one of the Microsoft experts on hand. You can save 25% off your registration by using this special sponsor link.
Or, connect with us on Facebook.com/sqlserver and Twitter @SQLServer and learn how Microsoft’s approach to data helps employees, IT professionals and data scientists transform data into an organizational advantage.
AMD, a leading designer and integrator of technology that powers millions of intelligent devices, needed better tools for monitoring manufacturing processes and other business operations.
More than a terabyte of test information was loaded weekly into a data warehouse, which was used by business analysts to run thousands of custom queries. To accelerate performance and handle increasingly larger data sets, the company had implemented Microsoft SQL Server 2008 R2 Parallel Data Warehouse in 2011.
The challenge database managers faced was that the data visualization tools the company was using did not work well with Microsoft Excel spreadsheet software and productivity applications familiar to business users.
AMD decided to implement a BI solution based on SQL Server 2014 Enterprise and SharePoint Server 2013. The data warehouse team wanted to take advantage of built-in, self-service BI tools such as Power View, an interactive data visualization feature of Microsoft SQL Server 2014 Reporting Services. Jesse Cantu, IT Director of Data Warehousing and Engineering at AMD, says, “With SQL Server 2014 and SharePoint Server 2013, we saw an opportunity to gain a highly integrated environment that would give end users more control over their data without requiring us to increase the complexity of our architecture.”
The company expects that the BI tools will benefit multiple business processes worldwide, including supply chain management. Faster implementation and accelerated business insight will help AMD improve agility. Matthew Floyd, BI Architect, sums it up: “We can empower employees throughout the company—from power users to people who are unfamiliar with analytics—to create reports. In turn, this frees our IT team to do what it does best, which is enhancing the data warehouse and implementing the latest enterprise BI tools.”
You can learn more about the AMD solution by reading the more detailed case study here.
Guest blog post by: SQL Server MVP Adam Jorgensen – PASS Executive Vice President, Finance & Governance, and President of Pragmatic Works – has been leading innovative SQL Server and Business Intelligence organizations for over a decade. His passion is finding new and creative avenues for clients and the community to embrace innovation and lower barriers to implementation. You can read his blog at AjBigData.com and follow him on Twitter at @AJBigData.
* * * * *
SQL Server 2014 is almost here! Many SQL Server professionals have been following the progress of this new release for a long time, and others have been so busy with day-to-day fire-fighting they haven’t had a chance to look at what’s coming. As the SQL Server team joins with PASS Virtual Chapters on a free “Countdown to SQL Server 2014” webinar series, I thought it would be a great time to highlight some of the features I’m most excited about.
Among the host of SQL Server 2014 capabilities that will enable innovative, new application and data platform opportunities, here are three of my favorites I can’t wait for the community to try out:
1. Delayed Durability
SQL Server 2014 lets us mark certain transactions as delayed durable, meaning control returns to the client before the log record is written to disk, as opposed to fully durable, which hardens the transaction log to disk before returning control to the client. Also called Lazy Commit, delayed durability can help reduce latency related to log I/O by keeping the transaction log records in memory and writing to the transaction log in batches. If you’re experiencing log I/O contention, it can also help reduce waits in the system. This setting – which we can control at the database, commit, or atomic level – provides for many new scalability opportunities and challenges. I’m looking forward to the solutions our community will create to leverage this capability.
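As a quick sketch (the database and table names here are hypothetical), durability can be relaxed at the database level and then requested per commit:

-- Allow, but do not force, delayed durable transactions in this database.
ALTER DATABASE SalesDB SET DELAYED_DURABILITY = ALLOWED;

-- Opt in for an individual transaction: control returns before the log records reach disk.
BEGIN TRANSACTION;
INSERT INTO dbo.AuditLog (EventTime, Message)
VALUES (SYSDATETIME(), N'low-value event that can tolerate a small data-loss window');
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);

The trade-off to keep in mind is that transactions committed this way can be lost if the server fails before the in-memory log buffer is flushed, so delayed durability is best reserved for data you can afford to lose or recreate.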
2. New Cardinality Estimation Design
The cardinality estimator has been redesigned in SQL Server 2014 to improve query plan quality and query performance. This new estimator calculates cardinality, essentially the number of rows the optimizer processes in an operation, using assumptions and algorithms that support modern OLTP and data warehousing workloads. The Microsoft SQL Server engineering team did a lot of research on these workloads to deliver a modern algorithm set that is customer tested and proven. You can read more about the new estimator in Books Online: What’s New (Database Engine) and in the white paper “Testing cardinality estimation models in SQL Server.”
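As a sketch of how you might try it out (database and table names are hypothetical), the new estimator is used when the database runs under compatibility level 120, and documented trace flags let you compare an individual query under either model:

-- Use the new SQL Server 2014 cardinality estimator for this database.
ALTER DATABASE SalesDB SET COMPATIBILITY_LEVEL = 120;

-- Compare a single query under the legacy estimator (trace flag 9481)...
SELECT CustomerId, COUNT(*) AS Orders
FROM dbo.SalesOrders
GROUP BY CustomerId
OPTION (QUERYTRACEON 9481);

-- ...or force the new estimator on a database still at an older compatibility level (trace flag 2312).
SELECT CustomerId, COUNT(*) AS Orders
FROM dbo.SalesOrders
GROUP BY CustomerId
OPTION (QUERYTRACEON 2312);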
3. Clustered Columnstore Indexes
Since SQL Server 2012 introduced nonclustered columnstore indexes, many of us have been looking forward to clustered columnstore indexes and seeing this new SQL Server 2014 feature in action in our own environments. A clustered columnstore index will improve data compression and query performance for many data warehousing workloads, especially in read-heavy and bulk-loading scenarios. And because the CCI is updatable, we can perform Selects, Inserts, Updates, and Deletes on these tables while still getting the performance of a clustered columnstore.
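For example (table and index names are hypothetical), converting a heap fact table is a single statement, and the table remains fully updatable afterwards:

-- Convert an existing data warehouse fact table to a clustered columnstore index.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
ON dbo.FactSales;

-- The clustered columnstore table still accepts DML.
INSERT INTO dbo.FactSales (DateKey, ProductKey, SalesAmount)
VALUES (20140301, 42, 199.99);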
These are just a few of the SQL Server 2014 features I think will be game-changers in many organizations, helping us deliver even higher performing and more highly scalable data systems. Which features are you most excited about? Let us know how you plan to use them. And I hope you’ll join PASS and Microsoft to learn more about this new release and how to take full advantage of it – sign up for your favorite webinars today.
See you out there!
Growing companies have a lot to worry about these days. How many new people will we need to staff? How do we produce more? How do we scale out our business? How do we rein in our costs? Where should we invest more? How is the market changing? How is the customer changing? How do we stay relevant and competitive? These are all questions that are impossible to answer without data, and data only becomes useful when you have the right tools to analyze it.
As a growing company, you probably have a lot of internal data created by your business and also access to external data that could tell you something about your market. Some of this data might be stored more securely on-premises, while some of it might be more accessible or scalable in the cloud for ease of access or collaboration. Your data might be coming from different systems you’re using, and therefore in varying form factors. There might be a lot of it, and you might wonder how you’ll continue to store and access that increasing volume of data in a cost-effective way as your company continues to grow and generate even more data. You want the data to be easy for all your employees to analyze, so they don’t always have to rely on specialized data scientists in corporate IT. And finally, you know that every pitch would be more powerful with dynamic and visual data supporting it, because at the end of the day, data insight is about telling a story that drives business action.
Microsoft’s data platform offers all these things. In fact, Microsoft is the only technology vendor to offer all these things. And Microsoft is able to do it with leading innovations in enterprise technology that are integrated into a complete data platform.
Innovative Technology Leader
Many technology vendors describe their latest products as “innovative” or “game changers”, but how many actually have the technology to back up those claims? How many times have we heard “in-memory” data processing referred to as “revolutionary”, when in fact that same technology was already being used by other vendors, and the very concept of in-memory computing was invented, well, with computers!
Microsoft’s data platform features technology that is truly innovative. Allow me to call out a few examples:
Complete Data Platform
As we look into the future of data platforms, there are 3 key categories that will be integral to the success of any platform vendor. Any tech vendor that claims to have a complete platform should, at a minimum, cover these solutions.
Every vendor’s Big Data pitch will start with mind-blowing statistics on the exponential growth of data volume in today’s uber-connected world, and then talk about some level of integration with Hadoop. But that’s about where the similarities end as each vendor has taken a different approach to solving this problem. You will see some vendors redefining Big Data according to their own strengths, for example, focusing on the speed of data while ignoring the prohibitive costs of storing and processing high volumes of data in a model that’s not scalable. Other vendors will try to solve the Big Data problem with a big machine, and when your data outgrows that machine, well, you can purchase another machine.
Microsoft takes a hybrid IT approach to scaling out Big Data. Technologies like HDInsight and Polybase are the most seamless integrations with Hadoop amongst full stack vendors. And all this data can be combined and analyzed with Power BI in Microsoft Excel. It’s scalable, cost efficient, and yes, that simple! See how Virginia Tech is using Microsoft’s Big Data solutions in the cloud to transform their life sciences research.
Corporate and Self-Service BI
How do you balance the BI and analytics needs of a central IT organization with the varying needs of functional departments and individual contributors? Companies like Harper Collins were able to rapidly deploy an efficient and agile cloud BI solution that gave their employees control over data.
All within Microsoft Excel, you can easily pull in data from multiple sources, you can analyze that data, and create stunning visualizations, including the ability to play back data changes and trends over time. No more copying and pasting data sets or chart images into Excel! Microsoft also has a Q&A feature that allows users to query databases using natural language text like “unit sales by region” or “top 50 customers in Texas by revenue.”
Other BI vendors are either niche players without a supporting enterprise data platform, or offer the data platform without strong BI capabilities. Microsoft is the only vendor with a complete data platform that offers BI solutions that are both powerful and intuitive.
SQL Server has come a long way for mission critical workloads since its 2000 version from over a decade ago. Today, the SQL Server 2014 platform includes in-memory capabilities across analytical and transactional workloads, unprecedented performance and scale, and an environment optimized for hybrid cloud. We now have a truly enterprise-grade, reliable, and secure data platform that customers trust for their mission critical workloads.
Unlike other vendors, Microsoft’s mission critical platform is available in your existing on-premises environment, in the cloud, or in hybrid environments to help you transition to a public or private cloud, and our Parallel Data Warehouse appliance is available with your choice of commodity hardware. In-Memory OLTP in SQL Server 2014 was designed for transactional processing, and all its innovative features are built right into SQL Server.
A great customer example is BMW using SQL Server 2014 to address its needs for a highly available, scalable, and cost-effective mission critical platform. With SQL Server 2014, BMW has gained nearly continuous availability, eliminated its storage replication, reduced troubleshooting time, and improved disaster recovery.
Microsoft is the only technology vendor that is able to provide a complete enterprise data platform – with a modern data warehouse, robust corporate BI capabilities, intuitive self-service BI capabilities, cloud SQL management, modern OLTP that is fast and reliable, and big data capabilities – all spanning on-premises, cloud, and hybrid environments. Scan through the list of other technology vendors your company currently has, and see how many of those vendors can score 100% on that checklist. (Hint: it should be zero)
Thanks for reading today! If you have an extra minute or two, make sure to check out the short video below summarizing how Microsoft is better able to help you get the most out of your data insights.
A Trustwave security researcher details database security issues and defensive techniques.
Greg Larsen discusses the BUCKET_COUNT setting of a HASH index and how to determine how well SQL Server distributes the In-Memory table rows across multiple buckets of a HASH index, by exploring a new DMV, provided with SQL Server 2014, along with the In-Memory OLTP table functionality.
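The DMV in question is presumably sys.dm_db_xtp_hash_index_stats; a sketch of such a bucket-distribution query might look like this:

-- Inspect how evenly rows are spread across the buckets of each memory-optimized hash index.
SELECT OBJECT_NAME(hs.object_id) AS table_name,
       i.name                    AS index_name,
       hs.total_bucket_count,
       hs.empty_bucket_count,
       hs.avg_chain_length,
       hs.max_chain_length
FROM sys.dm_db_xtp_hash_index_stats AS hs
JOIN sys.indexes AS i
  ON i.object_id = hs.object_id
 AND i.index_id  = hs.index_id;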
NuoDB is a single, logical database that can be deployed via the cloud in multiple locations simultaneously.
A couple of weeks back was a really exciting time for us. Less than a year after we released Office 365 for Businesses, we announced the general availability of Power BI for Office 365. You may have read previous blog articles by Quentin Clark on “Making Big Data Work for Everyone” and Kamal Hathi on “Simplifying Business Intelligence through Power BI for Office 365”. In this article, we’ll outline how we think about visualizations.
Why Visualizations Matter
While a list of items is great for entering or auditing data, data visualizations are a great way to distill information down to what matters most in a form that can be understood quickly. They work by engaging the visual parts of our brains, which are inherently designed to detect patterns quickly. On the left, for example, we have a list of research grants; on the right, a summarization of the overall amounts by month of the year, which is a much easier way to understand the relative spikes in September and October. As you can see, visualizations are great for making us all more productive with data.
Visualizations in Productivity Apps
We have the privilege of having the largest community of users of productivity applications in the world. Thanks to their ongoing feedback, we are able to understand their changing needs, and ensure Office evolves in directions our customers find most useful.
Back in the early days of computing, graphics and visualizations were not mainstream capabilities. Most machines lacked the graphics horsepower and memory needed, and people had to draw illustrations and charts by hand, which was time consuming and fraught with mistakes. That all changed when the Macintosh and PCs made graphics a first-class citizen. Microsoft introduced Microsoft Chart in 1984, and charting arrived in 1985 as a feature of Excel 1.0 for the Mac.
Advertisement, September 1984
This was a huge achievement. Customers could now use their computers to draw charts and other graphics for reports. Based on their feedback, numerous charting and graphics features were added to Office: charting was added to Excel 2.0 for Windows in 1987, 3D charts were introduced in 1990, and the chart wizard made it even easier to create charts in 1992. As computers spread from businesses into homes for consumer use, GPUs became more commonplace, and memory costs dropped, these innovations became broadly available.
Over the last decade, customer needs have evolved significantly, with more emphasis on faster creation, more interactivity, support for new types of data, and broader availability. Following is a list of innovations in recent releases, described to set the context of where we are as well as where we are headed.
Faster Creation of Visualizations
Excel 2007 introduced the ability to set the style of a chart with one click and leverage richer graphics such as shadows, anti-aliased lines, and transparency.
Office 2013 was one of our most ground-breaking releases. Quick Analysis provided immediate views of various visualization options and reduced the time needed to apply them to your data, Chart Recommendations used intelligent heuristics to suggest charts, and Automatic Relationship Detection allowed users to more easily analyze data stored in separate tables. The charting functionality provided live preview, gorgeous styles, richer data labels, and easier ways to add chart elements, apply styles, and filter with user interfaces directly on the charts.
Quick Analysis in Excel 2013
Recommended Charts in Excel 2013
Richer Interactivity
Part of my role at Microsoft involves presenting on various topics to stakeholders, and increasingly most of these include data visualizations. Only a few years back, I remember creating presentations with snapshots of charts pasted in as images, and carrying a notepad with interesting data nuggets that others might ask about. Later, as I got more “sophisticated”, the appendix of these presentations would contain numerous slides, each with an interesting snippet that someone might ask about, and I could jump to the relevant one to dig in deeper. For the very important presentations, these slides were printed out beforehand, together with printouts of the underlying data that someone could go through on the spot to answer questions that’d come up.
Data visualizations inherently invite questions. And not just the simple ones, but deep insightful ones, that go beyond making better point decisions to having a deeper understanding of how the underlying system behaves. Having an environment where we can test our hypotheses quickly utilizes the best of our creativity and learnings, making us effective participants instead of mere spectators, full on drivers instead of backseat passengers giving directions once in a while.
To enable such experiences, Excel 2010 introduced Slicers, an interactive way to filter data within Excel, and Excel 2013 introduced Timelines to make it trivial to compare data over different time periods.
Slicers in Excel 2010
Timeline in Excel 2013
Excel 2013 also introduced Power View, and with it brought beautiful interactivity to visualizations and more fluid exploration capabilities.
Power View in Excel 2013
With Power View, customers can create dashboards of interactive visualizations that provide instant answers to a variety of questions. This capability has resonated well with our customers, one of whom mentioned that the rigidity of static snapshots during meetings has been replaced by the “Power View lifestyle”, their term for the transformational way of presenting and using information.
We are very excited to have introduced Q&A as part of the Power BI launch. This innovative experience makes it even easier to understand your data by providing a natural language experience that interprets your question and immediately serves up the correct answer on the fly in the form of an interactive chart or graph. These visualizations change dynamically as you modify the question, creating a truly interactive experience with your data.
Q&A
Visualizations on All Data
In addition, both the volume and the types of data customers want to visualize have expanded.
Excel 2013 also introduced the Data Model, opening the door for workbooks that contain significantly larger datasets than before, with richer ways to express business logic directly within the workbook.
Increasingly, we have access to geospatial data, and the recently introduced Power Map brings a new 3D visualization tool for mapping, exploring, and interacting with geographical and temporal data to Excel, enabling people to discover and share new insights such as trends, patterns, and outliers in their data over time. In addition, with Power Map, users can easily capture and distribute their insights in the form of an interactive movie, telling compelling stories about their data.
Power Map in Excel 2013
Visualizations Everywhere
As customers create insights and share them, we have also invested in ensuring SharePoint 2013 and Office 365 provide the same full-fidelity rendering as the desktop client, so their work remains beautiful wherever it is consumed.
What’s Next?
The resurgence of data visualization in this decade, and the rise of more form factors on which it will be consumed, have made this an exciting field once again. Deeper interactivity that blends analysis and visualization even more fluidly, newer types of visualizations that enable you to see deeper insights more easily, richer experiences on the devices customers use most, and great storytelling experiences are just a few of the areas we’re investing in to make sure Office remains the productivity apps of choice as our customers’ needs evolve.
Partner Director of PM
Microsoft Office - Analytics and Presentation Services
Principal Group Program Manager
Office Data Experiences
For many of us, setting up an Oracle standby database has become fairly old hat. Just remember, keep everything on the standby server exactly the same as the primary, and everything will go fine. But what if you want your standby on the same server as the primary database? And why on earth would you want to do that anyway? Isn’t the point Disaster Recovery?
The PostgreSQL project has again been selected to take part in Summer of Code for 2014. Google will be funding several students to work with mentors from our project in order to hack PostgreSQL code for the summer.
Applications for students open March 10th. Our Summer Of Code page has all the information you need about applying this year.
If you are connected with a university, please make sure that students know about this opportunity. If you are a student, please apply!
The new In-Memory OLTP feature in SQL Server 2014 greatly optimizes the performance of certain OLTP applications. By using the new memory-optimized tables you can speed up data access, in particular in concurrency situations, due to the lock- and latch-free architecture of the In-Memory OLTP Engine. This means that applications that suffer from a lot of contention between concurrent transactions can benefit greatly from simply migrating their hot tables to memory-optimized tables.
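As a minimal sketch (table name, columns, and bucket count are illustrative), a hot table can be declared memory-optimized at creation time:

-- A durable memory-optimized table; the hash index requires an estimated bucket count.
CREATE TABLE dbo.SalesOrders
(
    OrderId    INT       NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    CustomerId INT       NOT NULL,
    OrderDate  DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);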
The other part of the equation is the new natively compiled stored procedures, which allow you to speed up query processing and business logic execution. These native procs are T-SQL stored procedures that are compiled to native code, in the form of DLLs, which are linked to the SQL Server process, for very efficient execution. The efficiency in natively compiled procs comes from savings in the execution path by baking the operations into machine code that can interact very efficiently with the In-Memory storage engine. For example, when scanning an index and identifying rows that match a certain predicate, we see that a native proc requires around 1/4th the number of CPU instructions compared with traditional interpreted T-SQL queries and stored procedures.
In this post we walk through some of the considerations when developing and using natively compiled stored procedures in your application.
OLTP-Style Operations
Natively compiled stored procedures are optimized for OLTP-style operations. Now, what do we mean by that? Some characteristics: a) single-threaded execution (MAXDOP=1); b) point lookups and small range scans, no full table scans, in general operations that touch a relatively small number of rows; c) nested-loops joins and stream aggregation; d) short-running transactions, in the ideal case a transaction spans a single execution of a natively compiled stored procedure.
Some examples of OLTP-style operations are inserting a sales order along with its line items, updating the status of a single order, and retrieving a customer record by its primary key.
Native procs are not optimized for reporting-style queries that require joins between, and aggregation over, large data sets.
ATOMIC Blocks in Native Procs
The body of a natively compiled stored procedure must comprise exactly one ATOMIC block. ATOMIC blocks are a new concept in SQL Server 2014 that can be used only with native procs. The basic thing an ATOMIC block gives you is that all statements within the block either succeed or fail as a unit, atomically. In the context of transactions, this means that if there is no open transaction when the procedure is called, the block starts a new transaction that commits when the block ends; if the session already has an open transaction, the block joins it and can be rolled back as a unit if an error occurs.
Because transactions are handled through the ATOMIC block, there is no need to bother with BEGIN TRANSACTION, ROLLBACK, or COMMIT inside natively compiled stored procedures. In fact, that syntax is not supported.
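A minimal sketch of such a procedure (reusing the hypothetical dbo.SalesOrders table from the earlier example) might look like this; the required options and the ATOMIC block are the important parts:

CREATE PROCEDURE dbo.usp_InsertSalesOrder
    @OrderId INT, @CustomerId INT
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    -- Every statement in this block succeeds or fails as a unit; no explicit BEGIN/COMMIT is needed.
    INSERT INTO dbo.SalesOrders (OrderId, CustomerId, OrderDate)
    VALUES (@OrderId, @CustomerId, SYSDATETIME());
END;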
For more details about ATOMIC blocks and transaction and error handling, see the corresponding topic in Books Online.
Retry logic for handling failures
As with all transactions that touch memory-optimized tables, with natively compiled stored procedures you will need to consider retry logic to deal with potential failures such as write conflicts (error 41302) or dependency failures (error 41301). In most applications the failure rate will be low, but it is still necessary to deal with the failures by retrying the transaction. Two suggested ways of implementing retry logic are to retry from the client application, or to wrap the natively compiled procedure in an interpreted T-SQL stored procedure that catches these errors and retries the call, as in the sketch below.
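Here is a sketch of the interpreted-wrapper approach; the procedure name, parameters, and retry count are hypothetical:

-- Retry the native proc call on transient In-Memory OLTP conflicts.
DECLARE @retries INT = 5;
WHILE @retries > 0
BEGIN
    BEGIN TRY
        EXEC dbo.usp_InsertSalesOrder @OrderId = 1001, @CustomerId = 42;
        SET @retries = 0;            -- success: stop retrying
    END TRY
    BEGIN CATCH
        -- 41301 = dependency failure, 41302 = write conflict; both are safe to retry.
        IF ERROR_NUMBER() IN (41301, 41302) AND @retries > 1
            SET @retries -= 1;
        ELSE
            THROW;                   -- non-transient error, or retries exhausted
    END CATCH;
END;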
For more details on retry logic and the error conditions to consider, see the corresponding topic in Books Online.
Table-Valued Parameters
Like traditional interpreted T-SQL stored procedures, natively compiled stored procedures support table-valued parameters (TVPs), which allow you to pass a rowset into a stored procedure. For example, if you want to insert a sales order along with its line items, you can use a TVP to encapsulate the line items.
The syntax and mechanisms to define and use table-valued parameters with natively compiled procs are the same as for interpreted procs. The only thing you need to take care of, is that you use a memory-optimized table type for the TVP. You can use memory-optimized table types with parameters in both native and interpreted stored procedures.
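For instance, a memory-optimized table type for the line items might look like this (type name, columns, bucket count, and the called procedure are illustrative):

CREATE TYPE dbo.SalesOrderLineType AS TABLE
(
    OrderId  INT NOT NULL,
    LineNo   INT NOT NULL,
    Quantity INT NOT NULL,
    INDEX ix_OrderId HASH (OrderId) WITH (BUCKET_COUNT = 1024)
)
WITH (MEMORY_OPTIMIZED = ON);

-- Fill the table variable and pass it to a (hypothetical) stored procedure.
DECLARE @Lines dbo.SalesOrderLineType;
INSERT INTO @Lines VALUES (1001, 1, 3), (1001, 2, 1);
EXEC dbo.usp_InsertSalesOrderLines @OrderId = 1001, @Lines = @Lines;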
For more details and an example of the use of TVPs in natively compiled stored procedures, see the Books Online topic on memory-optimized table variables.
Optimizing Client Invocation
In general, you can view a stored procedure execution as consisting of three phases:
Figure 1: Phases of stored procedure execution
If your application is sensitive to latency, i.e. how long it takes for a single stored procedure to execute, you will also want to optimize how you call the stored procedure from the client, in order to limit the overhead of the call itself. Optimizing client calls of stored procedures is not specific to natively compiled stored procedures, but it does play a bigger role for natively compiled procs, as the client invocation is proportionately a larger part of the overall procedure execution time, due to the optimization in the processing of queries and DML operations.
To optimize stored procedure calls, we recommend using prepared execution, where the procedure call is prepared once and then executed many times with different parameter values, rather than issuing a direct (ad hoc) execution for every call.
For an example of both direct and prepared execution with the ODBC driver in SQL Native Client, see Books Online here.
T-SQL Surface Area Limitations
SQL Server 2014 has some limitations on the features supported inside natively compiled stored procedures, which you should consider when using these stored procs and when deciding how to get the most out of them.
Because of these limitations you will see it can be challenging to migrate stored procedures in your existing application to native. We suggest you look for patterns that fit the surface area for native procs and migrate those patterns to native. You do not always need to migrate an entire stored procedure: if the existing stored procedure has a substantial piece of logic that can be migrated to native, you can consider putting only that piece into a new native proc, and modify the existing proc to call the new one. Note that migrating a single statement to a natively compiled stored procedure may not be beneficial due to the overhead of stored procedure invocation – you really want to have a larger subset of the proc that you move to native.
To understand which features in an existing interpreted T-SQL stored procedure are supported in natively compiled stored procedures, we recommend using the Native Compilation Advisor, which is part of Management Studio in SQL Server 2014. The Advisor will tell you which features used in the stored procedure are not supported in native, which will help in identifying the parts of the procedure that can be migrated to native, and will indicate the limitations you may need to work around.
The following two screenshots show an example of how to use the Advisor with the stored procedure dbo.uspGetBillOfMaterials in AdventureWorks.
Figure 2: Starting the Native Compilation Advisor for dbo.uspGetBillOfMaterials
Figure 3: Results of the Native Compilation Advisor for dbo.uspGetBillOfMaterials
In future blog posts we will go into more detail on how to perform a stored procedure migration given the surface area limitations, and give examples for migrating existing stored procedures to native.
The Change Data Capture feature of SQL Server captures DML changes happening on a tracked table. Arshad Ali demonstrates how this feature can be leveraged.
The PostgreSQL Global Development Group has released an important update to all supported versions of the PostgreSQL database system, which includes minor versions 9.3.3, 9.2.7, 9.1.12, 9.0.16, and 8.4.20. This update contains fixes for multiple security issues, as well as several fixes for replication and data integrity issues. All users are urged to update their installations at the earliest opportunity, especially those using binary replication or running a high-security application.
Security Fixes
This update fixes CVE-2014-0060, in which PostgreSQL did not properly enforce the WITH ADMIN OPTION permission for ROLE management. Before this fix, any member of a ROLE was able to grant others access to the same ROLE regardless of whether the member had been granted the WITH ADMIN OPTION permission. It also fixes multiple privilege escalation issues, including: CVE-2014-0061, CVE-2014-0062, CVE-2014-0063, CVE-2014-0064, CVE-2014-0065, and CVE-2014-0066. More information on these issues can be found on our security page and the security issue detail wiki page.
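As a brief sketch of the behavior the fix enforces (role names are hypothetical):

CREATE ROLE analysts NOLOGIN;
CREATE ROLE alice LOGIN;

GRANT analysts TO alice;                    -- plain membership only
-- After the fix, alice may NOT run: GRANT analysts TO bob;

GRANT analysts TO alice WITH ADMIN OPTION;  -- now alice may grant analysts to other roles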
With this release, we are also alerting users to a known security hole that allows other users on the same machine to gain access to an operating system account while it is doing "make check": CVE-2014-0067. "Make check" is normally part of building PostgreSQL from source code. As it is not possible to fix this issue without causing significant issues to our testing infrastructure, a patch will be released separately and publicly. Until then, users are strongly advised not to run "make check" on machines where untrusted users have accounts.
Replication and Data Integrity Fixes
This update also fixes some issues which affect binary replication and row locking, and can cause recoverable data corruption in some cases. There are several fixes to an index page locking issue during replication which could cause indexes on the replica to become corrupted. There is a fix to a transaction freezing bug in version 9.3 which could cause databases which cycled through transaction ID wraparound several times to have old row versions reappear. We have also fixed three bugs which could cause new standbys to fail to start up. Finally, this update fixes an issue which could break foreign keys, although the keys themselves will still need to be fixed manually after applying the update.
In version 9.3, these fixes result in the addition of several new server configuration settings to control multixact freezing. Importantly, standby servers must be updated to 9.3.3 or later before the replication master is updated, or replication will be broken.
Other Improvements
In addition to the above, the following issues are fixed in this release:
There are also fixes to all of the following optional modules (extensions): ECPG, dblink, ISN, pgbench, pg_stat_statements and postgres_fdw. Additional changes and details of some of the above issues can be found in the Release Notes.
As with other minor releases, users are not required to dump and reload their database or use pg_upgrade in order to apply this update release; you may simply shut down PostgreSQL and update its binaries. Users who have skipped multiple update releases may need to perform additional post-update steps; see the Release Notes for details.