

Real-time Operational Analytics in SQL Server 2016 - Part 3

Database Journal News - 6 hours 1 min ago

SQL Server 2016 introduces the Real-time Operational Analytics feature, which allows you to run both transactional/operational and analytics workloads on the same system. Read on to learn how to create a filtered, non-clustered columnstore index.
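
For reference, here is a minimal T-SQL sketch of such an index. The table and column names are illustrative only; the WHERE filter keeps "hot" rows out of the columnstore so OLTP writes stay fast while analytics scan the compressed, filtered data.

-- Hypothetical example: an analytics columnstore over completed orders only
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_Orders_Analytics
ON dbo.Orders (OrderId, CustomerId, OrderDate, Quantity, Amount)
WHERE OrderStatus = 5;  -- e.g. only shipped ("cold") rows enter the columnstore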

Categories: Database

World’s Biggest SQL Server Event is Coming in 6 Days!

This post is by Joseph Sirosh, Corporate Vice President of the Data Group at Microsoft.

SQL Server 2016

We are only 6 days away from PASS Summit 2016 (@sqlpass), which kicks off next Wednesday, October 26th, in Seattle. SQL PASS Summit (#SQLSummit, #sqlpass) is the world’s largest and most intensive technical training conference for Microsoft SQL Server (@SQLServer) and BI professionals. But more than that, it’s a conference – planned and presented by the SQL Server community for the SQL Server community (#sqlfamily). It has the most technical sessions, the largest number of attendees, the best networking, and the highest-rated sessions and speakers of any SQL Server event in the world. If you are in the Seattle area and would like to attend the Summit, you still have time to register. If you cannot make it, the event will be live-streamed, just like last year.

PASS Summit 2016

I am very excited to give a keynote again at PASS Summit this year. I have a lot of exciting things to share with you since last October. SQL Server 2016 is an industry leader, and it now packs an even bigger punch in the recent Gartner Magic Quadrant for Operational Database Management Systems: Microsoft is positioned as a leader, highest in execution and furthest in vision, for the second year in a row. SQL Server 2016 builds on this leadership and is packed with powerful built-in features. As the least vulnerable database for six years in a row, SQL Server 2016 offers security that no other database can match. It also has the data warehouse (Data Warehouse Fast Track) with the highest price-performance, and offers end-to-end mobile BI solutions on any device at a fraction of the cost of other vendors. It provides tools to go beyond BI with in-database advanced analytics, integrating the R language and scalable analytics functions. The world around us – every business and nearly every industry – is being transformed by technology today, and SQL Server 2016 was built for this new world and to help businesses get ahead of today’s disruptions.

I am very excited to present at PASS our cutting-edge content on using SQL Server 2016 to solve real-world problems, and I will have a number of top-notch speakers joining me on the stage.

Rohan Kumar

Rohan Kumar (@RohanKSQL), General Manager of Database Systems, will join me on the stage and talk about SQL Server 2016 and how customers around the world rely on SQL Server to build applications that deliver unmatched security and performance for their most demanding workloads.

Rohan Kumar Keynote

Rohan will also present a fascinating demo, Machine Learning @ 1,000,000 predictions per second, in which he will show real-time predictive fraud detection and scoring in SQL Server 2016. This is made possible by combining the performance of SQL Server in-memory OLTP and in-memory columnstore with R and machine learning. Rohan will also talk about HTAP – how SQL Server 2016 enables you to process transactions and analyze both historical and real-time data in memory, with up to 100x faster in-memory analytics and 30x faster in-memory OLTP. Rohan will also cover our “better together” story – Windows Server 2016 and SQL Server 2016 working together. With the release of Windows Server 2016, SQL Server supports unparalleled scalability, with a single SMP machine today supporting up to 24TB of memory and 640 cores.


Then joining Rohan on the stage will be one of our customers, PROS (@PROS_Inc), who will talk about how SQL Server 2016 with built-in R enabled them to better serve their customers. One of their customers is an airline that needs to respond to over 100 million unique, individualized price requests each day, each in under 200 milliseconds. It is practically impossible for humans to do this – to understand the market economics using all available data and to respond in under 200 milliseconds. The combination of SQL Server 2016 and the Azure cloud provided the unified platform and global footprint that made it much easier for PROS to accomplish this, and they will tell you all about it.

By the way, I highly recommend the free ebook that just came out, Data Science with Microsoft SQL Server 2016, to learn how to install, configure, and use Microsoft’s SQL Server R Services in data science projects and to tackle scenarios similar to the PROS one.


Rohan will also tell you how Microsoft and Intel (@Intel) engineering teams work closely together to tune and optimize our technologies for performance. This is an exciting year for both companies, with the latest Intel Xeon processor E5 v4 family and the introduction of SQL Server 2016. Performance gains in SQL Server 2016 are greatest when you refresh both hardware and software, yielding high double-digit improvements. Furthermore, data volumes are continually growing, and this is another step Intel and Microsoft are taking together to enable scalable, cost-effective, large data warehouses for advanced analytics. Intel and Microsoft engineering collaborated to enable a 100TB data warehouse within a single 4-socket Intel Xeon E7 v4 server running Intel SATA SSD storage. Intel will come and share these impressive performance results with the PASS community.


Another fascinating customer that will join me at PASS is Next Games (@nxtgms). Kalle Hiitola (@kallehiitola), CTO and co-founder of Next Games, will tell their story. Next Games is a Finnish game developer and publisher based in Helsinki. They develop free-to-play games for iOS and Android. You might know other big names from the same region, like Supercell and Rovio. With the help of Azure, they have been able to create globally scalable games with close to 17 million downloads!

The Walking Dead No Man's Land

Next Games will talk about how Azure and our NoSQL service, Azure DocumentDB (@DocumentDB), power their games. The biggest of their titles is The Walking Dead No Man’s Land, with millions of installs. For a modern player, everything needs to happen instantly or you lose their attention; to prevent that, they needed the most powerful database they could get – Azure DocumentDB.

Integral Analytics Inc

Another amazing customer that will join me on the stage is Integral Analytics. Integral Analytics is a data science and analytics software company that provides core operational, planning, and valuation applications for utilities and other participants in the electric-energy industry. Integral Analytics’ proprietary products help the largest energy companies in the United States and Canada navigate the “grid edge” reliably and cost-efficiently. To perform its mission successfully, Integral Analytics needed a cloud provider that supports batch processing, real-time processing, easy development, and integrated advanced-analytics services. Come and find out directly from Integral Analytics about Azure SQL Data Warehouse (#AzureSqlDW).

Julie Koesmarno

Another amazing presenter that will join me on the stage is Julie Koesmarno. Many of you know Julie very well as @MsSQLGirl. Julie will present an incredible demo called “Big Cognition”. Software thus far has been “sterile”: unable to understand or use human emotions, let alone combine them with anything else. Using our cutting-edge technology, you can now extract emotional sentiment from images, videos, speech and text, and you can do it in bulk. You can now join emotions extracted from image content with any other type of data you have and do incredibly powerful analytics and intelligence over it. This is what I mean by Big Cognition. It’s not about extracting one piece of cognitive information at a time, or just understanding an emotion or whether there’s an object in an image; it’s about joining all the extracted cognitive data with other types of data, so you can do some real magic with it. You really don’t want to miss this demo by Julie.

Another partner I will be talking about at the keynote is eSmart Systems (@eSmart_Systems). I will talk about their Connected Drone project.

eSmart Systems

The objective of Connected Drone is to support inspections of power lines, which today are performed either by ground crews walking miles and miles of power lines or through dangerous helicopter missions to monitor the lines aerially (if there is one place you don’t want humans in helicopters, it’s over high-voltage power lines). With Connected Drone, eSmart uses deep learning to automate as much of the inspection process as possible. As the drones fly over power lines, they stream live data through Azure for analytics. eSmart Systems uses different types of neural networks, including deep neural networks (DNNs), to do this. They analyze both still images and videos from the drones and are able to recognize objects in real time. I will tell you how the combination of our powerful database engines with deep learning can enable previously unthought-of scenarios and products.

Jen Stirrup

Finally, I will have SQL Server MVP Jen Stirrup (@jenstirrup) join me on the stage. Jen is very well known in the SQL PASS community and is also a PASS board member. Jen wanted me to keep her demo a surprise… You’ll just have to watch it. I’ll say just two things: it’s something that has never been done at PASS Summit before, and I am really excited to have her present with me in the keynote. In the end, this demo might be all people are talking about after the keynote.

SQL PASS is a must-watch, must-see, must-do event! I look forward to seeing you all there next week.


Categories: Database

5 Advantages of Load Balancing For IT Companies

Database Journal News - Thu, 10/20/2016 - 08:01

As an IT company, it is crucial that you maintain uptime and monitor the performance of various processes. By making database load balancing a key element, you can ensure that your processes and websites run smoothly.

Categories: Database

Free eBook: Using SQL Server 2016 for Data Science & Advanced Analytics

Reposted from the Cortana Intelligence & Machine Learning blog.

Data Science with Microsoft SQL Server 2016

The world around us – every business and nearly every industry – is being transformed by technology. SQL Server 2016 was built for this new world and to help businesses get ahead of today’s disruptions. With this free eBook, you will learn how to install, configure and use Microsoft’s SQL Server R Services in your data science and advanced analytics projects.

Read the Foreword to the eBook and download it from here.

Categories: Database

PostgresDAC 3.2.0 with PostgreSQL 9.6 and Android support is out

PostgreSQL News - Tue, 10/18/2016 - 01:00
PostgresDAC is a direct access component suite for RAD Studio and PostgreSQL

New milestone PostgresDAC release is out! Now with PostgreSQL 9.6 and Android target platform support.

From now on, NUMERIC fields are mapped to TFmtBcdField, which allows manipulating values of this type with great flexibility.

This release contains a breaking change, so developers should check their sources wherever NUMERIC fields are used.

Full changelog:
  • [!] Android target platform support added
  • [!] v9.6.0 client libraries added
  • [!] v9.6.0 dump & restore libraries (pg_dump.dll, pg_restore.dll) added
  • [!] NUMERIC fields are now mapped to TFmtBcdField
  • [+] TPSQLRestore.SchemaNames property introduced to specify multiple schemas
  • [+] doStrictNames option added to TPSQLDump.Options
  • [+] roStrictNames option added to TPSQLRestore.Options
  • [*] TPSQLRestore.TableNames property now matches all types of relations, not only plain tables
  • [*] TPSQLDump now dumps locally-made changes in privilege assignments for system objects
  • [*] TPSQLRestore.SchemaName property removed as deprecated, use SchemaNames instead
  • [-] Assigning MEMO fields to empty string may cause memory leak

You're welcome to download PostgresDAC v3.2.0 right now at: or log in to your private area on our site at


Please don't hesitate to ask any questions or report bugs with our Support Ticketing system available at

Categories: Database, Open Source

SQL Server as a Machine Learning Model Management System

This post was authored by Rimma Nehme, Technical Assistant, Data Group

Machine Learning Model Management

If you are a data scientist, business analyst or a machine learning engineer, you need model management – a system that manages and orchestrates the entire lifecycle of your learning model. Analytical models must be trained, compared and monitored before deploying into production, requiring many steps to take place in order to operationalize a model’s lifecycle. There isn’t a better tool for that than SQL Server!

SQL Server as an ML Model Management System

In this blog post, I will describe how SQL Server can enable you to automate, simplify and accelerate machine learning model management at scale – from build, train, test and deploy all the way to monitor, retrain and redeploy or retire. SQL Server treats models just like data, storing them as serialized varbinary objects. As a result, it is agnostic to the analytics engine used to build a model, which makes it a good model management tool not only for R models (R is now built into SQL Server 2016) but for other runtimes as well.

SELECT * FROM [dbo].[models]


Figure 1: Machine Learning model is just like data inside SQL Server.
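
For illustration, here is a minimal sketch of what such a model store might look like; the table and column names are hypothetical, not prescribed by SQL Server:

-- A model is just a row with a varbinary payload (e.g. an R object
-- serialized on the client with serialize(model, NULL))
CREATE TABLE dbo.models (
    model_id    INT IDENTITY(1,1) PRIMARY KEY,
    model_name  NVARCHAR(128) NOT NULL,
    model       VARBINARY(MAX) NOT NULL,   -- the serialized model object
    created_at  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);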

SQL Server’s approach to machine learning model management is an elegant solution. While there are existing tools that provide some capabilities for managing models and deployment, using SQL Server keeps the models “close” to the data, so nearly all the capabilities of a management system for data transfer seamlessly to machine learning models (see Figure 2). This can simplify the process of managing models tremendously, resulting in faster delivery and more accurate business insights.

Publishing Intelligence To Where Data Lives

Figure 2: By pushing machine learning models inside SQL Server 2016 (on the right), you get throughput, parallelism, security, reliability, compliance certifications and manageability, all in one. It’s a big win for data scientists and developers – you don’t have to build the management layer separately. Furthermore, just like data in databases can be shared across multiple applications, you can now share the predictive models. Models and intelligence become “yet another type of data”, managed by SQL Server 2016.

Why Machine Learning Model Management?

Today there is no easy, systematic way to monitor, retrain and redeploy machine learning models. In general, data scientists collect the data they are interested in, prepare and stage the data, apply different machine learning techniques to find a best-in-class model, and continually tweak the parameters of the algorithm to refine the outcomes. Automating and operationalizing this process is difficult. For example, a data scientist must code the model, select parameters and a runtime environment, train the model on batch data, and monitor the process to troubleshoot errors that might occur. This process is repeated iteratively on different parameters and machine learning algorithms, and after comparing the models on accuracy and performance, the model can then be deployed.

Currently, there is no standard method for comparing, sharing or viewing models created by other data scientists, which results in siloed analytics work. Without a way to view models created by others, data scientists leverage their own private library of machine learning algorithms and datasets for their use cases. As models are built and trained by many data scientists, the same algorithms may be used to build similar models, particularly if a certain set of algorithms is common for a business’s use cases. Over time, models begin to sprawl and duplicate unnecessarily, making it more difficult to establish a centralized library.


Figure 3: Why SQL Server 2016 for machine learning model management.

In light of these challenges, there is an opportunity to improve model management.

Why SQL Server 2016 for ML Model Management?

There are many benefits to using SQL Server for model management. Specifically, you can use SQL Server 2016 for the following:

  • Model Store and Trained Model Store: SQL Server can efficiently store a table of “pre-baked” models of commonly used machine learning algorithms that can be trained on various datasets (already present in the database), as well as trained models for deployment against a live stream for real-time data.
  • Monitoring service and Model Metadata Store: SQL Server can provide a service that monitors the status of the machine learning model during its execution on the runtime environment for the user, as well as any metadata about its execution that is then stored for the user.
  • Templated Model Interfaces: SQL Server can store interfaces that abstract the complexity of machine learning algorithms, allowing users to specify the inputs and outputs for the model.
  • Runtime Verification (for External Runtimes): SQL Server can provide a runtime verification mechanism using a stored procedure to determine which runtime environments can support a model prior to execution, helping to enable faster iterations for model training.
  • Deployment and Scheduler: Using SQL Server’s trigger mechanism, automatic scheduling and an extended stored procedure, you can perform automatic training, deployment and scheduling of models on runtime environments, obviating the need to operate the runtime environments during the modeling process (a small sketch follows this list).
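
As a hedged illustration of the last bullet, a trigger on the hypothetical model store above could queue each newly inserted model for a scheduled deployment job; all object names here are illustrative:

CREATE TABLE dbo.model_deployment_queue (
    model_id   INT NOT NULL,
    queued_at  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_models_queue_deployment
ON dbo.models
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- a SQL Agent job (or any scheduler) can then drain this queue and deploy
    INSERT INTO dbo.model_deployment_queue (model_id)
    SELECT model_id FROM inserted;
END;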

Here is the list of specific capabilities that make the above possible:

ML Model Performance:
  • Fast training and scoring of models using operational analytics (in-memory OLTP and in-memory columnstore).
  • Monitor and optimize model performance via Query store and DMVs. Query store is like a “black box” recorder on an airplane. It records how queries have executed and simplifies performance troubleshooting by enabling you to quickly find performance differences caused by changes in query plans. The feature automatically captures a history of queries, plans, and runtime statistics, and retains these for your review. It separates data by time windows, allowing you to see database usage patterns and understand when query plan changes happened on the server.
  • Hierarchical model metadata (that is easily updateable) using native JSON support: Expanded support for unstructured JSON data inside SQL Server enables you to store properties of your models in JSON format. You can then process JSON data just like any other data inside SQL. It enables you to organize collections of your model properties, establish relationships between them, combine strongly-typed scalar columns stored in tables with flexible key/value pairs stored in JSON columns, and query both scalar and JSON values in one or multiple tables using full Transact-SQL. You can store JSON in in-memory or temporal tables, apply Row-Level Security predicates on JSON text, and so on (see the sketch after this list).
  • Temporal support for models: SQL Server 2016’s temporal tables can be used for keeping track of the state of models at any specific point in time. Using temporal tables in SQL Server you can: (a) understand model usage trends over time, (b) track model changes over time, (c) audit all changes to models, (d) recover from accidental model changes and application errors.
ML Model Security and Compliance:
  • Sensitive model encryption via Always Encrypted: Always Encrypted can protect models at rest and in motion by requiring client applications to use an Always Encrypted driver when communicating with the database, so data is transferred in an encrypted state.
  • Transparent Data Encryption (TDE) for models. TDE is the primary SQL Server encryption option. TDE enables you to encrypt an entire database that may store machine learning models. Backups for databases that use TDE are also encrypted. TDE protects the data at rest and is completely transparent to the application and requires no coding changes to implement.
  • Row-Level Security enables you to protect the model in a table row-by-row, so a particular user can only see the models (rows) to which they are granted access.
  • Dynamic model (data) masking obfuscates a portion of the model data to anyone unauthorized to view it, returning masked data (e.g. masked credit card numbers) to non-privileged users.
  • Change model capture can be used to capture insert, update, and delete activity applied to models stored in tables in SQL Server, and to make the details of the changes available in an easily consumed relational format. The change tables used by change data capture contain columns that mirror the column structure of a tracked source table, along with the metadata needed to understand the changes that have occurred.
  • Enhanced model auditing. Auditing is an important checks-and-balances mechanism for many organizations. SQL Server 2016 adds new auditing features that support model auditing: you can implement user-defined audit, audit filtering and audit resilience.
ML Model Availability:
  • AlwaysOn for model availability and champion-challenger. An availability group in SQL Server supports a failover environment. An availability group supports a set of primary databases and one to eight sets of corresponding secondary databases. Secondary databases are not backups. In addition, you can have automatic failover based on DB health. One interesting thing about availability groups in SQL Server with readable secondaries is that they enable “champion-challenger” model setup. The champion model runs on a primary, whereas challenger models are scoring and being monitored on the secondaries for accuracy (without having any impact on the performance of the transactional database). Whenever a new champion model emerges, it’s easy to enable it on the primary.
ML Model Scalability
  • Enhanced model caching can facilitate model scalability and high performance. SQL Server enables caching with automatic, multiple TempDB files per instance in multi-core environments.
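
To make the JSON and temporal bullets above concrete, here is a hedged sketch, with all names hypothetical, of a system-versioned model table carrying JSON metadata:

CREATE TABLE dbo.models_versioned (
    model_id   INT PRIMARY KEY,
    model      VARBINARY(MAX) NOT NULL,
    metadata   NVARCHAR(MAX) NULL,   -- JSON document holding model properties
    valid_from DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    valid_to   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (valid_from, valid_to)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.models_versioned_history));

-- Query a property out of the JSON metadata...
SELECT model_id, JSON_VALUE(metadata, '$.algorithm') AS algorithm
FROM dbo.models_versioned;

-- ...and audit what a model looked like at a given point in time.
SELECT model_id, metadata
FROM dbo.models_versioned
FOR SYSTEM_TIME AS OF '2016-10-01T00:00:00'
WHERE model_id = 1;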

In summary, SQL Server delivers top-notch data management with performance, security, availability, and scalability built into the solution. Because SQL Server is designed to meet security standards, it has a minimal total surface area and is inherently more secure. Enhanced security, combined with built-in, easy-to-use tools and controlled model access, can help organizations meet strict compliance policies. Integrated high-availability solutions enable faster failover and more reliable backups – and they are easier to configure, maintain, and monitor, which helps organizations reduce the total cost of model management (TCMM). In addition, SQL Server supports complex data types and non-traditional data sources, and it handles them with the same attention – so data scientists can focus on improving model quality and outsource all of the model management to SQL Server.


Using SQL Server 2016, you can do model management with ease. SQL Server differs from other machine learning model management tools because it is a database engine, optimized for data management. The key insight here is that “models are just like data” to an engine like SQL Server, and as such we can apply most of the mission-critical data management features built into SQL Server to machine learning models. Using SQL Server for ML model management, an organization can create an ecosystem for harvesting analytical models, enabling data scientists and business analysts to discover the best models and promote them for use. As companies rely more heavily on data analytics and machine learning, the ability to manage, train, deploy and share models that turn analytics into action-oriented outcomes is essential.


Categories: Database

Developers: Don’t miss this PASS Summit pre-conference session

The new era of data-driven and intelligent applications is here – powered by a robust data platform and a secure cloud infrastructure. The cloud gives us the power to collect and integrate data from an enormous variety of sources, to process big data at amazing scale and economics, to dramatically simplify development and deployment, and to offer amazing intelligent APIs and applications as hosted services.

Microsoft’s data platform and Azure data services are packed with tons of features to make apps more customer-friendly, more secure, more performant, more intelligent and most of all – easier to develop! From the security benefits of Dynamic Data Masking and built-in intelligence/analytics to the performance improvements of in-memory OLTP, Microsoft’s data platform and Azure provide developers with the tools they need to develop modern, smarter, intelligent applications.

Sounds good, right? Join Microsoft data experts Tobias Ternstrom, Lindsey Allen, and many more at the PASS Summit pre-conference session, Taking A Hands-On Approach to Building Intelligence into Data Driven Applications. Bring your device to this hands-on session and get a full day of app building and exploring these features in depth, with Microsoft Product Engineering walking you through each step. The data for development immersion program includes 15 labs based on real-life scenarios developers face every day.

At the end of the day you will walk away with sample code and knowledge on how to create your own data-driven intelligence application.

Categories: Database

Basic DB2 Application Tuning Strategies

Database Journal News - Mon, 10/17/2016 - 08:01

Assume you’ve been given a directive by IT management to "tune" things. What strategies are available and how do you judge which ones are best? Considering the current economic climate, can any be implemented on a shoestring budget? Can any of these tasks be assigned to resources with basic DBA skills and still be productively accomplished?

Categories: Database

SQL Server 2016 Express Edition in Windows containers

We are excited to announce the public availability of SQL Server 2016 Express Edition in Windows Containers! The image is now available on Docker Hub and the build scripts are hosted on our SQL Server Samples GitHub repository. This image can be used in both Windows Server Containers as well as Hyper-V Containers.

SQL Server 2016 Express Edition Docker Image | Installation Scripts

We hope you will find these images useful and leverage them for your container-based applications!

Why use SQL Server in containers?

SQL Server 2016 in a Windows container would be ideal when you want to:

  1. Quickly create and start a set of SQL Server instances for development or testing.
  2. Maximize density in test or production environments, especially in microservice architectures.
  3. Isolate and control applications in a multi-tenant infrastructure.

Before you can get started with the SQL Server 2016 Express Edition image, you’ll need a Windows Server 2016 or Windows 10 host with the latest updates, the Windows Container feature enabled, and the Docker engine.

Please find the details for each of these requirements below.

  • Get a Windows Server 2016 or Windows 10 host
    • Windows Server 2016: You can start by downloading an evaluation copy from the TechNet Evaluation Center. Please make sure that all the latest Windows updates are installed, most importantly KB3176936 and KB3192366.
    • Windows 10: You will need Windows 10 Anniversary Edition Professional or Enterprise. Note: if you are on the Windows Insider builds, make sure that you are using build 14942.1000 or higher to avoid an issue with the Docker run command in older builds.
  • Enable the Windows Container feature and install the Docker Engine
Pulling and Running SQL Server 2016 in a Windows Container

Below are the Docker pull and run commands for running a SQL Server 2016 Express instance in a Windows container.

Make sure that the mandatory sa_password environment variable meets the SQL Server 2016 Password Complexity requirements.

First, pull the image

docker pull microsoft/mssql-server-2016-express-windows

Then, run a SQL Server container

Running a Windows Server Container (Windows Server 2016 only):

docker run -d -p 1433:1433 --env sa_password=<YOUR_PWD> microsoft/mssql-server-2016-express-windows

Running a Hyper-V Container (Windows Server 2016 or Windows 10):

docker run -d -p 1433:1433 --env sa_password=<YOUR_PWD> --isolation=hyperv microsoft/mssql-server-2016-express-windows

Connecting to SQL Server 2016 From within the container

An easy way to connect to the SQL Server instance from inside the container is by using the sqlcmd utility.

First, use the docker ps command to get the container ID that you want to connect to and use it to replace the parameter placeholder ‘<DOCKER_CONTAINER_ID>’ in the commands below. You can use the docker exec -it command to create an interactive command prompt that will execute commands inside of the container.

You can connect to SQL Server by using either Windows or SQL Authentication.

Windows authentication using container administrator account

docker exec -it <DOCKER_CONTAINER_ID> sqlcmd

SQL authentication using the system administrator (SA) account

docker exec -it <DOCKER_CONTAINER_ID> sqlcmd -S. -Usa

From outside the container

One of the ways to access SQL Server 2016 from outside the container is by installing SQL Server Management Studio (SSMS). You can install and use SSMS either on the host or on another machine that can remotely connect to the host.

Connect from SSMS installed on the host

To connect from SSMS installed on the host, you’ll need the following information:

  • The IP Address of the container
    One of the ways to get the IP address of the container is by using the docker inspect command:
    docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <DOCKER_CONTAINER_ID>
  • The SQL Server port number
    This is the same port number that was specified in the docker run command. If you used 1433 you don’t need to specify the port. If you want to specify a port to connect to you can add it to the end of the server name like this: myserver,1433.
  • SQL system administrator account credentials
    The username is ‘sa’ and the sa_password that was used in the docker run command.

Connect from SSMS on another machine (other than the Host Environment)

To connect from SSMS installed on another machine (that can connect to the host), you’ll need the following information:

  • The IP address of the host
    You can get the host’s IP address by using the ipconfig command from a PowerShell or command prompt window.
  • The SQL Server port number
    This is the same port that was specified in the docker run command. If you used 1433 you don’t need to specify the port. If you want to specify a port to connect to you can add it to the end of the server name like this: myserver,1433.
    Note: Depending on your configuration, you might have to create a firewall rule to open the necessary SQL Server ports on the host. Please refer to this article for more information regarding container networking.
  • SQL system administrator account credentials
    The username is ‘sa’ and the sa_password that was used in the docker run command.
SQL 2016 Features Supported on Windows Server Core

Please refer to this link for all SQL Server 2016 features that are supported on a Windows Server Core installation.

Developing Using Windows 10 Containers

Check out this blog post by Alex Ellis, Docker Captain, on how to use SQL Server 2016 Express Edition in a Windows container as part of an application development and test environment on Windows 10.

Docker with Microsoft SQL 2016 + ASP.NET

Further Reading

Windows Containers Documentation
Container Resource Management
SQL Server 2016 GitHub Samples Repo
Tutorials for SQL Server 2016

Categories: Database

A New Way To Pattern Match In Oracle 12c

Database Journal News - Thu, 10/13/2016 - 08:01

Oracle 12c offers a new way to match patterns in your data, the MATCH_RECOGNIZE function.  Read on to see the syntax and the function in action.

Categories: Database

Court Orders Rimini Street to Stop Unlawful Conduct and Awards Oracle $27.7 Million in Prejudgment Interest

Oracle Database News - Wed, 10/12/2016 - 20:00
Press Release Court Orders Rimini Street to Stop Unlawful Conduct and Awards Oracle $27.7 Million in Prejudgment Interest

Redwood Shores, Calif.—Oct 12, 2016

Yesterday, the United States District Court for the District of Nevada issued two significant rulings in Oracle’s litigation against Rimini Street and its CEO Seth Ravin.

The first ruling is a permanent injunction barring Rimini Street from continuing to infringe Oracle’s copyrights and other unlawful acts. The Court previously determined that a permanent injunction was warranted, noting Rimini Street’s “callous disregard for Oracle’s copyrights and computer systems when it engaged in the infringing conduct” and that “Rimini’s business model was built entirely on its infringement of Oracle’s copyrighted software and its improper access and downloading of data from Oracle’s website and computer systems.”

The Court’s four-page permanent injunction prohibits certain copying, distribution, and use of Oracle’s copyrighted software and documentation by Rimini Street and also imposes limitations on Rimini Street’s access to Oracle’s websites. The order states, for example, that Rimini Street may not use a customer’s software environment “to develop or test software updates or modifications for the benefit of any other licensee.” The order also prohibits Rimini Street’s preparation of certain derivative works from Oracle’s copyrighted software. The order applies not only to Rimini Street, but also “its subsidiaries, affiliates, employees, directors, officers, principals, and agents.”

“This permanent injunction imposes important restrictions on Rimini Street,” said Oracle’s General Counsel Dorian Daley, “and Oracle is grateful that the Court has taken steps to prevent continuing unlawful acts by Rimini Street and its executives.”

Although Rimini Street has stated that there was “no expected impact” from any injunction, Rimini Street also told the Court that it “could suffer significant harm to its current business practices if the proposed injunction were entered,” which it now has been. These contradictory positions raise the issue of whether Rimini is misleading the Court or making misrepresentations to customers.

In its second ruling, the Court awarded Oracle approximately $27.7 million in prejudgment interest, with an additional amount of prejudgment interest to be awarded based on the date of the final judgment. This is in addition to the $50 million jury verdict and the $46.2 million in attorneys’ fees and costs awarded to Oracle last month.

Contact Info Deborah Hellinger
+1.212.508.7935 About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit


Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.

Talk to a Press Contact

Deborah Hellinger

  • +1.212.508.7935

Follow Oracle Corporate

Categories: Database, Vendor

1,000,000 predictions per second

This post is by Joseph Sirosh, Corporate Vice President of the Data Group at Microsoft.

Transactional Workloads + Intelligence

Online transaction processing (OLTP) database applications have powered many enterprise use-cases in recent decades, with numerous implementations in banking, e-commerce, manufacturing and many other domains. Today, I’d like to highlight a new breed of applications that marry the latest OLTP advancements with advanced insights and machine learning. In particular, I’d like to describe how companies can predict a million events per second with the very latest algorithms, using readily available software. We have shown this demo at the Microsoft Machine Learning and Data Science Summit and at my General Session at Ignite in Atlanta, Georgia. You can watch both online. The predictive model was based on a boosted decision tree algorithm with 50 trees and 33 features.

Machine Learning at 1M PPS

Take credit card transactions, for instance. These can trigger a set of decisions that are best handled with predictive models. Financial services companies need to determine whether a particular transaction is fraudulent or legitimate.

As the number of transactions per second (TPS) increases, so does the number of predictions per second (PPS) that organizations need to make. The Visa network, for instance, was capable of handling 56,000 TPS last year and managed over 100 billion yearly transactions. With each transaction triggering a set of predictions and decisions, modern organizations need a powerful platform that combines OLTP with a high-speed prediction engine. We expect that an increasing number of companies will need to hit 1 million predictions per second (PPS) or more in the coming years.

What kind of architecture would enable such use cases? At Microsoft, we believe that computing needs to take place where the data lives. Keeping the prediction engine close to the database (i.e., in-database analytics) minimizes data movement and eliminates the costs and security risks associated with it. Moreover, the predictive models can be shared by multiple applications. That’s precisely how SQL Server 2016 was designed.

Take the credit card fraud detection example I mentioned above – one can handle it in the following manner:

  • A data scientist creates a predictive model for credit-card fraud detection based on historical transaction data. This model is stored as a standard database object inside a database.
  • New credit-card transactions are ingested and stored in high-speed in-memory columnstores.
  • The data is likely to require some preparation for advanced analytics. This includes operations such as joining data across multiple tables, cleansing, creating aggregations and more. SQL shines at this, because these steps execute much faster in production when done at the database layer.
  • The new transaction data and the predictive model are sent (using T-SQL) to an in-database predictive engine. Predictions can then be done in batch or at the single transaction level. In SQL Server 2016 you can build on the power of R, with its extensive set of packages and the built-in high scale algorithmic library (ScaleR) provided by Microsoft.
  • Predictions can be returned immediately to an application via T-SQL and/or stored in the database for further use (a minimal T-SQL sketch of this scoring step follows the figure below).

This is shown visually below:

Fraud Predictions Visual
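
To make the scoring step concrete, here is a minimal T-SQL sketch using sp_execute_external_script, the mechanism SQL Server R Services provides for running R in-database. The table, column and model names (dbo.models, dbo.new_transactions, fraud_score) are assumptions for illustration, not taken from the original demo:

DECLARE @model VARBINARY(MAX) =
    (SELECT model FROM dbo.models WHERE model_name = N'fraud_detection');

EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
mdl <- unserialize(model)              # rebuild the R model object
pred <- predict(mdl, InputDataSet)     # score the incoming batch
OutputDataSet <- cbind(InputDataSet, fraud_score = pred)',
    @input_data_1 = N'SELECT * FROM dbo.new_transactions',
    @params = N'@model varbinary(max)',
    @model = @model;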

The above architecture is very versatile. In addition to using it in fraud detection, we’ve applied this architecture to perform what-if analysis on an auto loan dataset.

Analytical Workloads + Intelligence

Imagine a loan application where a financial services company needs to determine if a loan will be repaid on time. Similarly to predicting fraudulent transactions, you can leverage SQL Server 2016 as a Scoring Engine to predict “bad” loans. Loans that indicate good repayment behavior are considered “good” and loans that indicate less than perfect repayment behavior are considered “bad”. Imagine scanning through millions of loan applications and being able to predict – within seconds – which loans will default. Now imagine a business analyst launching the same exercise while modeling a scenario where the Federal Reserve increases interest rates. Our loan default prediction model was able to reach and exceed a staggering 1,250,000 predictions per second, completing the what-if analysis within 15-18 seconds. This capability now enables our customers to have near real-time predictive analytics. The architecture is shown visually below:

Loan Default Prediction Visual

One common customer request is an intelligent method of predicting how changing factors like interest rates, loan terms or even a member’s credit score would affect the charge-off probability. You can specify a what-if input for an increased interest rate and score the open loans with the new proposed rate, using parallel threads that call a SQL Server stored procedure to invoke the scoring model on the open loans. You can then compare the base predictions with the what-if predictions, study how the probability of HIGH charge-offs grows as the interest rate increases, and see how that may affect various branches of your business. Such near real-time predictive analytics capabilities minimize research bias, dramatically increase business flexibility and keep the focus on attributes that matter, which results in higher profitability. A hedged sketch of such a stored procedure follows.
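
This sketch reuses the hypothetical model store and an assumed dbo.open_loans table; every object name is illustrative, and a production model would likely use the ScaleR functions (e.g. rxPredict) mentioned earlier rather than base predict:

CREATE PROCEDURE dbo.score_loans_what_if
    @rate_delta FLOAT   -- the proposed interest-rate increase to simulate
AS
BEGIN
    DECLARE @model VARBINARY(MAX) =
        (SELECT model FROM dbo.models WHERE model_name = N'loan_default');

    EXEC sp_execute_external_script
        @language = N'R',
        @script = N'
df <- InputDataSet
df$interest_rate <- df$interest_rate + delta    # apply the what-if change
mdl <- unserialize(model)
OutputDataSet <- data.frame(loan_id = df$loan_id,
                            default_prob = predict(mdl, df))',
        @input_data_1 = N'SELECT loan_id, interest_rate, credit_score FROM dbo.open_loans',
        @params = N'@model varbinary(max), @delta float',
        @model = @model,
        @delta = @rate_delta;
END;

Running the procedure with @rate_delta = 0 gives the base predictions; comparing those against a non-zero run yields the what-if deltas in plain T-SQL.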

At Ignite, we had Jack Henry & Associates on the stage with me. They provide more than 300 products and services to over 10,000 credit unions, enabling them to process financial transactions and automate their services. Using SQL Server as a scoring engine enabled their vision of building an intelligent enterprise data warehouse that helps their customers increase their productivity. They have been working with Microsoft to leverage SQL Server’s built-in R capability to build intelligence into their current data warehousing service portfolio. Such an intelligent data warehouse helps credit unions and financial services companies become more flexible and react to situations in a data-driven manner. We see opportunities in applying such models within the database to customer churn prediction, loan portfolio changes and a host of other scenarios. Several banking and insurance companies rely on very complex architectures to do predictive analytics and scoring today. Using the architecture outlined in this blog post, businesses can do this in a dramatically simpler and faster manner.

The possibilities are endless.

SQL Server as a Scoring Engine

We’ve posted several samples on GitHub. The available templates are listed below.

  • Predictive Maintenance. Predict machine failures.
  • Customer Churn Prediction. Predict when customer churn will happen.
  • Online Purchase Fraud Detection. Predict whether an online purchase transaction is fraudulent.
  • Energy Demand Forecasting. Forecast electricity demand of multiple regions.
  • Retail Forecasting. Forecast the product sales for a retail store.
  • Campaign Management. Predict when and how to contact potential customers.
  • Predicting Risk on Consumer Loans is posted here.

This is how companies are predicting at the speed of data, today.


Categories: Database

PASS Summit 2016: world’s biggest SQL Server event

PASS Summit 2016 is nearly upon us. With only 4 weeks until the big event, now is the time to register!

PASS Summit 2016 is a community-driven event with three days of technical sessions, networking, and professional development. Don’t miss your chance to stay up to date on the latest and greatest Microsoft data solutions along with 4,000 of your data professional and developer peers.

What’s new this year?  So many things!

• PASS Summit is not just for DBAs.  With nearly 1,000 developers attending the event, Microsoft has increased the number of sessions focused on application development and developer tools by 60%.

• While many people attend PASS Summit to grow fundamental database skills, we know that many attendees are very experienced, senior data professionals, so we increased the number of deep technical sessions by half.

• We have also added a new type of session called a Chalk Talk. These are Level 500 sessions with Microsoft senior program management hosting open Q&A in a collegiate-style setting. Seating is limited to 50, so you’ll want to get there early to claim your spot.

In addition to these enhancements, Microsoft has also increased its investment in sending employees onsite to talk with attendees. They’ll be easy to spot – all 500 Microsoftees will be wearing bright fuchsia t-shirts. You can find them in big numbers at the Day 1 keynote, the Microsoft booth, SQL Clinic, Wednesday’s Birds of a Feather luncheon, Thursday’s WIT luncheon, and of course in our big booth in the Expo Hall.

Have a technical challenge or need architecture advice?

SQL Clinic is the place to be. SQL Clinic is the hub of technical experts from SQLCAT, the Tiger Team, CSS, and others. Whether you are looking for SQL Server deployment support, have a troublesome technical issue, or are developing an application, the experts at SQL Clinic will have the right advice for you.

Click here to register today!

Are you a member of a PASS chapter or virtual chapter?  If so, remember to take advantage of the $150 discount code.  Contact your chapter leader for details.

Sending your whole team? There is also a great group discount for companies sending five or more employees.

Once you get a taste for the learning and networking waiting for you at PASS Summit, we invite you to join the conversation by following @SQLServer on Twitter as well as @SQLPASS and #sqlsummit. We’re looking forward to an amazing event, and can’t wait to see everyone there!

Stay tuned for regular updates and highlights on Microsoft and PASS activities planned for this year’s conference.

Categories: Database

Top 6 MySQL DBA Mistakes

Database Journal News - Mon, 10/10/2016 - 08:01

To err is human, or so the saying goes. Unfortunately, in the world of IT, innocuous mistakes made early on can lead to really daunting problems down the road. In today’s article, we’ll go over a few of the most common DBA mistakes to avoid. Some of these relate specifically to MySQL, while others may be applicable to any RDBMS.

Categories: Database

Oracle Announces Final Extension of Tender Offer for NetSuite to Expire on November 4

Oracle Database News - Fri, 10/07/2016 - 13:05
Press Release Oracle Announces Final Extension of Tender Offer for NetSuite to Expire on November 4

Redwood Shores, Calif.—Oct 7, 2016

Oracle Corporation (NYSE: ORCL) announced today that it has extended the expiration of its tender offer for the acquisition of NetSuite Inc. (NYSE: N) to Friday, November 4, 2016. This will be the final extension that Oracle is obligated to make under the merger agreement. In the event that a majority of NetSuite’s unaffiliated shareholders do not tender sufficient shares to reach the minimum tender condition, Oracle will respect the will of NetSuite’s unaffiliated shareholders and terminate its proposed acquisition.

Napa Acquisition Corporation, a subsidiary of OC Acquisition LLC, a subsidiary of Oracle Corporation, has extended its all-cash tender offer for $109.00 per share for all of the issued and outstanding shares of common stock, par value of $0.01 per share (the “Shares”), of NetSuite Inc. to 12:00 Midnight, Eastern time, at the end of Friday, November 4, 2016. Per the terms of the merger agreement, a majority of the unaffiliated Shares, or 20,403,928 out of 40,807,854 total unaffiliated Shares, must be tendered in order for the minimum condition to be satisfied and the tender offer to be completed. In aggregate, there are approximately 81,474,150 Shares issued and outstanding as of October 5, 2016.

American Stock Transfer & Trust Company LLC, the depositary for the tender offer, has indicated that as of 12:00 Midnight, Eastern time, at the end of October 6, 2016, approximately 4,568,498 unaffiliated Shares, or 11.2% of the total unaffiliated Shares, and 45,084,266 total Shares, or 55.3% of the total Shares issued and outstanding, have been tendered into and not properly withdrawn from the tender offer. Both figures include 293,328 Shares tendered pursuant to the guaranteed delivery procedures set forth in the Offer to Purchase.

Contact Info Deborah Hellinger
+1.212.508.7935 About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit


Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Additional Information

This communication does not constitute an offer to buy or solicitation of an offer to sell any securities. This communication is for informational purposes only. The tender offer is not being made to, nor will tenders be accepted from, or on behalf of, holders of shares in any jurisdiction in which the making of the tender offer or the acceptance thereof would not comply with the laws of that jurisdiction. The tender offer is being made pursuant to a tender offer statement on Schedule TO (including the Offer to Purchase, a related Letter of Transmittal and other offer materials) filed by Napa Acquisition Corporation with the U.S. Securities and Exchange Commission ("SEC") on August 18, 2016, as amended from time to time. In addition, on August 18, 2016, NetSuite Inc. filed a Solicitation/Recommendation statement on Schedule 14D-9 with the SEC related to the tender offer. Stockholders of NetSuite Inc. are urged to read these documents, all amendments thereto and other documents filed with the SEC carefully in their entirety because they contain important information about the tender offer. The tender offer statement and certain other offer documents, along with the Solicitation/Recommendation statement, will be made available to all stockholders of NetSuite Inc. at no expense to them, and are available at no charge through the web site maintained by the SEC. The Offer to Purchase, related Letter of Transmittal, the Solicitation/Recommendation statement and other offering documents may also be obtained for free by contacting the Information Agent for the tender offer, Innisfree M&A Incorporated, toll-free at 888-750-5834.

Cautionary Statement Regarding Forward-Looking Statements

This document contains certain forward-looking statements about Oracle and NetSuite Inc., including statements that involve risks and uncertainties concerning Oracle’s proposed acquisition of NetSuite Inc., anticipated customer benefits and general business outlook. When used in this document, the words "anticipates", "can", "will", "look forward to", "expected" and similar expressions and any other statements that are not historical facts are intended to identify those assertions as forward-looking statements. Any such statement may be influenced by a variety of factors, many of which are beyond the control of Oracle or NetSuite Inc., that could cause actual outcomes and results to be materially different from those projected, described, expressed or implied in this document due to a number of risks and uncertainties. Potential risks and uncertainties include, among others, the possibility that the transaction will not close or that the closing may be delayed, the anticipated synergies of the combined companies may not be achieved after closing, the combined operations may not be successfully integrated in a timely manner, if at all, general economic conditions in regions in which either company does business, and the possibility that Oracle or NetSuite Inc. may be adversely affected by other economic, business, and/or competitive factors. Accordingly, no assurances can be given that any of the events anticipated by the forward-looking statements will transpire or occur, or if any of them do so, what impact they will have on the results of operations or financial condition of Oracle or NetSuite Inc. You are cautioned to not place undue reliance on forward-looking statements, which speak only as of the date of this document. Neither Oracle nor NetSuite Inc. is under any duty to update any of the information in this document.

In addition, please refer to the documents that Oracle and NetSuite Inc., respectively, file with the SEC on Forms 10-K, 10-Q and 8-K. These filings identify and address other important factors that could cause Oracle’s and NetSuite Inc.’s respective operational and other results to differ materially from those contained in the forward-looking statements set forth in this document. You are cautioned to not place undue reliance on forward-looking statements, which speak only as of the date of this document. Neither Oracle nor NetSuite Inc. is under any duty to update any of the information in this document.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Deborah Hellinger

  • +1.212.508.7935

Follow Oracle Corporate

Categories: Database, Vendor

Azure SQL Database - Implementing Azure AD Authentication

Database Journal News - Thu, 10/06/2016 - 08:01

Marcin Policht focuses on the specifics of using Azure Active Directory to authenticate to Azure SQL Database.

Categories: Database

repmgr 3.2 released

PostgreSQL News - Thu, 10/06/2016 - 01:00

2ndQuadrant is proud to announce the release of repmgr version 3.2, which includes Barman support, replication cluster monitoring, standby cloning improvements, better integration of OS-level server commands, and other minor commands and improvements.

This release leverages improvements included with PostgreSQL 9.6 to provide support for switchover, a common administrative task where the roles of primary and standby server are swapped in a controlled manner to facilitate software and hardware maintenance.

Note: repmgr 3.2 supports PostgreSQL 9.3 and later only.

What is repmgr?

repmgr is an open source package which greatly simplifies the process of setting up and managing replication using streaming replication within a cluster of PostgreSQL servers. Taking advantage of features introduced in PostgreSQL releases since 9.3, repmgr builds on PostgreSQL's core functionality to provide a smooth, user-friendly way of working with the complexities of replication and failover management in high-availability environments.

New features in repmgr 3.2

Barman Support

repmgr 3.2 provides enhanced integration with 2ndQuadrant's Barman backup and recovery manager. In particular, it is now possible to clone a standby from a Barman archive, rather than directly from a running database server. This means the server is not subjected to the I/O load caused by a backup, and there's no need to manage WAL retention on the database server. The standby can also use the Barman archive as a fallback source of WAL in case streaming replication is interrupted.

Replication Cluster Monitoring

Usually, when working with more complex replication clusters spread over multiple locations, gaining an overview of connection status between nodes can be a tedious and time-consuming process. repmgr 3.2 provides two new commands which help with this, expanding the existing “repmgr cluster show”. You can read in depth about these commands in the README file.

Standby Cloning Improvements

In addition to the ability to clone directly from a Barman archive, various other improvements have been made to the standby cloning process. In particular, a password-less SSH connection between servers is now optional by default and only needs to be provided for specific features.

Better Integration of OS Level Server Commands

By default, repmgr uses PostgreSQL's standard pg_ctl utility to control a running PostgreSQL server. However, it may be preferable to use the operating system's service management system instead. The following repmgr.conf configuration settings specify which service control commands are used (see the sample configuration after this list):

  • service_start_command
  • service_stop_command
  • service_restart_command
  • service_reload_command
  • service_promote_command
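
A hedged example for a systemd-managed PostgreSQL 9.6 instance (the service name is illustrative and varies by distribution and packaging):

    # repmgr.conf fragment: delegate server control to systemd
    service_start_command   = 'sudo systemctl start postgresql-9.6'
    service_stop_command    = 'sudo systemctl stop postgresql-9.6'
    service_restart_command = 'sudo systemctl restart postgresql-9.6'
    service_reload_command  = 'sudo systemctl reload postgresql-9.6'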

Read the formal release notes for information on other new commands, options, and improvements included in this release as well as upgrade instructions.

2ndQuadrant provides professional support services for repmgr; for more information, visit the 2ndQuadrant website.

Categories: Database, Open Source

Businesses Take Note: Less Than Half of Employees Say They Have the Right Technology to Do Their Jobs

Oracle Database News - Tue, 10/04/2016 - 13:00
Press Release: New Oracle Global Engagement Study finds that digital technology, leadership and company values rank among the top differentiators for employee success

Redwood Shores, Calif.—Oct 4, 2016

Oracle Corporation (NYSE: ORCL) today released a Global Engagement Study conducted by global market research company Kantar TNS, including feedback from nearly 5,000 full-time employees at organizations with 250 or more employees. The study indicates that equipping employees with the latest technology, having accessible leaders, and upholding strong company values are important factors that can ultimately determine an employee’s success or failure within a company.

Digital Enablement

Technology is enabling companies to connect with employees in more ways than ever and to create a more modern, customized learning experience. However, the study gives companies low marks for capitalizing on this: only 44 percent of respondents say that their company uses the latest technology to enable them to perform effectively in their role.

“Employees as consumers are more plugged into technology via multiple devices (i.e. mobile, tablet, desktop) than ever before. They expect the same level of accessibility that they get in their personal lives with the technology they encounter at work,” said Gretchen Alarcon, group vice president of HCM Product Strategy, Oracle. “Cloud technology enables this digital employee experience through features that can help employees learn, process and consume information in an easy way. A more modern user experience interface also reflects what they are accustomed to at home.”


The findings also show that a strong, visible leadership presence is the backbone of employee satisfaction and engagement. The study indicates that productivity starts with onboarding, yet employees are unsatisfied with the process: only 41 percent believe that company onboarding practices set them up for growth and success. Not only do managers make the first impression of a company during onboarding, they are also the first source of direction for the new worker. Only 47 percent of those polled view their leaders as visible and approachable, and only 44 percent express confidence in their leadership, indicating a lack of partnership between management and employees.

“Employees ultimately decide if they would like to stay with a company within the first two weeks of employment,” said Alarcon. “What this means, is that within the first 14 days, employees are already asking themselves, ‘Do I think I can progress here? Do I have a manager who can be a mentor and am I getting the ability to create a network and get introduced to the right people and tools to best perform in my job?’ This is especially important when we think about development within a company—candidates want to feel the company is a good fit.”

Additionally, the study identified key indicators of a healthy leadership-to-employee relationship:

  • Set examples of how best to communicate with those working under you
  • Remain extremely accessible so that people feel connected to company goals
  • Be actively involved in the working lives of new employees from the day they start
  • Use technology and digital experiences to stay in touch with team members

Values, Culture & Reward

What’s more, companies are now seeing that shared values between the employee and the company play a huge role in employee engagement. A strikingly low 38 percent say that their company is concerned about their overall well-being, indicating that working towards a personally rewarding goal while upholding individual values can carry even more weight than monetary compensation.

Company culture mirrors these sentiments: employees are most comfortable and productive within a creative yet flexible workplace culture. While compensation is often seen as connected to employee satisfaction, individuals are now more interested in joining companies that uphold the same personal values they do, using that alignment to gauge whether they are a good fit. Whether it is a more flexible work schedule or more volunteer or health and wellness opportunities, companies should now be listening and connecting with employees in ways that support their personal well-being.

The full study is available on the Oracle website.

For more information, view our infographic, checklist and blog post.

About the Study

Oracle collaborated with Kantar TNS on The Global Engagement Study, which surveyed nearly 5,000 full-time employees at organizations with 250 or more employees. The study encompassed a total of 4,706 interviews that captured opinions, feelings and experiences on a wide range of workplace factors, including culture, performance, tangible and intangible rewards, development and progression opportunities, experiences with collaboration technology, the impact of leadership, and how employees feel in their current role. Percentages quoted are for respondents giving a top-two-box selection on a seven-point agreement scale.

About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE: ORCL), visit the Oracle website.


Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The preceding is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Talk to a Press Contact

Jennifer Yamamoto

  • +1.916.761.9555

Categories: Database, Vendor

pglogical 1.2 Now Available

PostgreSQL News - Tue, 10/04/2016 - 01:00

2ndQuadrant is proud to announce the release of pglogical 1.2 - now supporting PostgreSQL 9.6. Implemented entirely as a PostgreSQL extension, pglogical is a logical replication system that serves as a highly efficient method of replicating data as an alternative to physical replication.

In addition to a change in the way the output plugin and apply plugin are handled, some of the important feature updates and changes in this release include:

  • keepalive tuned to much smaller values by default so that pglogical will notice network issues earlier
  • Better compatibility when upgrading from PostgreSQL 9.4 - it's now safe to use the internal schema synchronization for that
  • Improved conflict handling
  • Improved support for replica triggers
  • Several issues with initial synchronizations fixed
  • Easier monitoring of the initial synchronization in pg_stat_replication

There are also several documentation improvements.
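
As a minimal sketch of how pglogical is used (DSNs, node and subscription names are illustrative), replication is set up entirely through SQL functions:

    -- Both nodes need wal_level = 'logical' and
    -- shared_preload_libraries = 'pglogical' in postgresql.conf.

    -- On the provider database:
    CREATE EXTENSION pglogical;
    SELECT pglogical.create_node(
        node_name := 'provider1',
        dsn := 'host=provider.example.com dbname=appdb');
    SELECT pglogical.replication_set_add_all_tables('default', ARRAY['public']);

    -- On the subscriber database:
    CREATE EXTENSION pglogical;
    SELECT pglogical.create_node(
        node_name := 'subscriber1',
        dsn := 'host=subscriber.example.com dbname=appdb');
    SELECT pglogical.create_subscription(
        subscription_name := 'subscription1',
        provider_dsn := 'host=provider.example.com dbname=appdb');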

To find more information, including installation instructions, please visit the pglogical home page on the 2ndQuadrant website.

Categories: Database, Open Source

Windows Server 2016 and SQL Server 2016: Leveraging Hyper-V large-scale VM performance for in-memory transaction processing

This post was authored by Liang Yang, Principal Performance Engineer on the Hyper-V team and Jos de Bruijn, Senior Program Manager on the SQL Server team.

With Windows Server 2016, Microsoft has significantly bumped up the Hyper-V Virtual Machine (VM) scale limit to embrace new scenarios such as running large in-memory e-commerce databases for Online Transaction Processing (OLTP) and Data Warehousing (DW) workloads. In this post on the Windows Server Blog, we highlight the performance of in-memory transaction processing at scale using SQL Server 2016 running in a Windows Server 2016 Hyper-V VM.
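
As a hedged sketch of the in-memory OLTP feature being measured (table and column names are illustrative, and the database must already have a MEMORY_OPTIMIZED_DATA filegroup), a durable memory-optimized table is declared like this:

    -- Memory-optimized tables require at least one index;
    -- an inline nonclustered primary key satisfies this.
    CREATE TABLE dbo.ShoppingCart (
        CartId    INT IDENTITY PRIMARY KEY NONCLUSTERED,
        UserId    INT       NOT NULL,
        CreatedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

SCHEMA_AND_DATA keeps the table fully durable while all row access happens in memory, which is what makes large-memory VMs attractive for this workload.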

Categories: Database