Subscribe to Methods & Tools
if you are not afraid to read more than one page to be a smarter software developer, software tester or project manager!
Today, organizations are spending a lot of time trying to simplify, align development with operations, and manage commodity or virtualized resources on-demand. This helps to reduce costs, improve efficiency and evolve solutions in a more agile fashion. Read on to learn more.
The pgAdmin Development Team are pleased to announce the release of pgAdmin 4 v1.1. This is the second release of pgAdmin 4, and includes over 40 bug fixes and improvements. For details, please see the issue tracker roadmap.
pgAdmin 4 will be bundled with the EDB PostgreSQL 9.6.1 installers, or can be downloaded in source, PIP wheel, macOS or Windows packages from the pgAdmin website.
For more details about DB Doc, visit the product page here.
The PostgreSQL Global Development Group has released an update to all supported versions of our database system, including 9.6.1, 9.5.5, 9.4.10, 9.3.15, 9.2.19, and 9.1.24. This is also the last update for the PostgreSQL 9.1 series, which is now end-of-life. This release fixes two issues that can cause data corruption, described in more detail below. It also patches a number of other bugs reported over the last three months. The project urges users to apply this update at the next possible downtime.

WAL-logging of truncated relations
Prior to this release, there was a chance that a PostgreSQL instance would try to access data that no longer existed on disk. If the free space map was not updated to be aware of the truncation, a PostgreSQL database could return a page that was already truncated and produce an error such as:
ERROR: could not read block 28991 in file "base/16390/572026": read only 0 of 8192 bytes
If checksumming is enabled, checksum failures in the visibility map could also occur.
This issue is present in the 9.3, 9.4, 9.5, and 9.6 series of PostgreSQL releases.

pg_upgrade issues on big-endian machines
On big-endian machines (e.g. many non-Intel CPU architectures), pg_upgrade would incorrectly write the bytes of the visibility map, causing pg_upgrade to fail to complete.
If you are using a big-endian machine (many non-Intel architectures are big-endian) and have used pg_upgrade to upgrade from a pre-9.6 release, you should assume that all visibility maps are incorrect and need to be regenerated. It is sufficient to truncate each relation's visibility map with contrib/pg_visibility's pg_truncate_visibility_map() function. Please read the "Updating" section for post-installation instructions on how to resolve this issue on your PostgreSQL instances.
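As a sketch of that remediation (assuming the pg_visibility extension is available; run it once in each affected database), the truncation can be applied to every relation in a single statement:

```sql
-- Sketch only: run in each database of an instance upgraded with pg_upgrade.
-- Requires contrib/pg_visibility (new in PostgreSQL 9.6).
CREATE EXTENSION IF NOT EXISTS pg_visibility;

-- Truncate the visibility map of every table, materialized view and
-- TOAST table; subsequent VACUUMs will rebuild the maps correctly.
SELECT pg_truncate_visibility_map(c.oid)
FROM pg_class c
WHERE c.relkind IN ('r', 'm', 't');
```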
This issue is present only in the PostgreSQL 9.6.0 release.

Bug Fixes and Improvements
In addition to the above, this update also fixes a number of bugs reported in the last few months. Some of these issues affect only the 9.6 series, but many affect all supported versions. There are more than 50 fixes provided in this release, including:
This update also contains tzdata release 2016h for DST law changes in Palestine and Turkey, plus historical corrections for Turkey and some regions of Russia. It also switches to numeric abbreviations for some time zones in Antarctica, the former Soviet Union, and Sri Lanka.
The IANA time zone database previously provided textual abbreviations for all time zones, sometimes making up abbreviations that have little or no currency among the local population. They are in the process of reversing that policy in favor of using numeric UTC offsets in zones where there is no evidence of real-world use of an English abbreviation. At least for the time being, PostgreSQL will continue to accept such removed abbreviations for timestamp input, but they will not be shown in the pg_timezone_names view nor used for output.
In this update, AMT is no longer shown as being in use to mean Armenia Time. Therefore, we have changed the Default abbreviation set to interpret it as Amazon Time, i.e. UTC-4 rather than UTC+4.

EOL Notice for Version 9.1
PostgreSQL version 9.1 is now End-of-Life (EOL). No additional updates or security patches will be released by the community for this version. Users still on 9.1 are urged to upgrade as soon as possible. See our Versioning Policy (https://www.postgresql.org/support/versioning/) for more information.

Updating
All PostgreSQL update releases are cumulative. As with other minor releases, users are not required to dump and reload their database or use pg_upgrade in order to apply this update release; you may simply shut down PostgreSQL and update its binaries.
If your system was affected by the big-endian pg_upgrade bug, please read Visibility Map Problems and follow the instructions there to resolve the issue on your PostgreSQL instances.
Users who have skipped one or more update releases may need to run additional, post-update steps; please see the release notes for earlier versions for details.
This post is authored by Meet Bhagdev.
We are delighted to share the full release of the Microsoft ODBC Driver 13 for Linux (Ubuntu, RedHat and SUSE). The new driver enables access to SQL Server, Azure SQL Database and Azure SQL DW from any C/C++ application on Linux.

What's new
Install the ODBC Driver for Linux on Ubuntu 15.04

1. sudo su
2. sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/mssql-ubuntu-vivid-release/ vivid main" > /etc/apt/sources.list.d/mssqlpreview.list'
3. sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893
4. apt-get update
5. apt-get install msodbcsql
6. apt-get install unixodbc-dev-utf16 # this step is optional but recommended

Install the ODBC Driver for Linux on Ubuntu 15.10
1. sudo su
2. sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/mssql-ubuntu-wily-release/ wily main" > /etc/apt/sources.list.d/mssqlpreview.list'
3. sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893
4. apt-get update
5. apt-get install msodbcsql
6. apt-get install unixodbc-dev-utf16 # this step is optional but recommended

Install the ODBC Driver for Linux on Ubuntu 16.04
1. sudo su
2. sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/mssql-ubuntu-xenial-release/ xenial main" > /etc/apt/sources.list.d/mssqlpreview.list'
3. sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893
4. apt-get update
5. apt-get install msodbcsql
6. apt-get install unixodbc-dev-utf16 # this step is optional but recommended

Install the ODBC Driver for Linux on RedHat 6
1. sudo su
2. yum-config-manager --add-repo https://apt-mo.trafficmanager.net/yumrepos/mssql-rhel6-release/
3. yum-config-manager --enable mssql-rhel6-release
4. wget "http://aka.ms/msodbcrhelpublickey/dpgswdist.v1.asc"
5. rpm --import dpgswdist.v1.asc
6. yum remove unixODBC # to avoid conflicts during installation
7. yum update
8. yum install msodbcsql
9. yum install unixODBC-utf16-devel # this step is optional but recommended

Install the ODBC Driver for Linux on RedHat 7
1. sudo su
2. yum-config-manager --add-repo https://apt-mo.trafficmanager.net/yumrepos/mssql-rhel7-release/
3. yum-config-manager --enable mssql-rhel7-release
4. wget "http://aka.ms/msodbcrhelpublickey/dpgswdist.v1.asc"
5. rpm --import dpgswdist.v1.asc
6. yum remove unixODBC # to avoid conflicts during installation
7. yum update
8. yum install msodbcsql
9. yum install unixODBC-utf16-devel # this step is optional but recommended

Install the ODBC Driver for SUSE 12
1. zypper ar https://apt-mo.trafficmanager.net/yumrepos/mssql-suse12-release/ "mssql" # to add the repo
2. wget "http://aka.ms/msodbcrhelpublickey/dpgswdist.v1.asc"
3. rpm --import dpgswdist.v1.asc
4. zypper remove unixODBC # to avoid conflicts
5. zypper update
6. zypper install msodbcsql
7. zypper install unixODBC-utf16-devel # this step is optional but recommended
Note: Packages for SQL Server command line tools will be available soon. The above-mentioned packages only install the ODBC Driver for SQL Server, which enables connectivity from any C/C++ application.

Try our Sample
Once you have installed the driver on a supported Linux distro, you can use this C sample to connect to SQL Server, Azure SQL DB or Azure SQL DW. To download the sample and get started, follow these steps:
If you installed the driver using the manual instructions, you will have to manually uninstall the ODBC Driver and the unixODBC Driver Manager to use the deb/rpm packages. If you have any questions on how to manually uninstall, feel free to leave a comment below.
Please file bugs, questions and issues on our issues page. We welcome contributions, questions and issues of any kind. Happy programming!

Survey and Future Plans
Please take this survey to help prioritize features and scenarios for the next release of the ODBC Driver for Linux. Going forward we plan to expand SQL Server 16 Feature Support (example: Always Encrypted), improve test coverage, and fix bugs reported on our issues page.
Please stay tuned for upcoming releases that will have additional feature support and bug fixes. This applies to our wide range of client drivers including PHP 7.0, Node.js, JDBC and ADO.NET which are already available.
This post is by Joseph Sirosh, Corporate Vice President of the Data Group at Microsoft.
In June, we announced the general availability of SQL Server 2016, the world's fastest and most price-performant intelligence database for HTAP (Hybrid Transactional and Analytical Processing), with updateable in-memory columnstores and advanced analytics through deep integration with R Services. As we officially move into the data-driven intelligence era, we continue to bring new capabilities to more applications, environments and users than ever before.
Today, we're making several announcements to bring even more value to our customers.

1. Public Preview of Azure Analysis Services
Azure Analysis Services (Azure AS) is based on the proven analytics engine in SQL Server Analysis Services: an enterprise-grade OLAP engine and BI modeling platform, offered as a fully managed platform-as-a-service (PaaS). Azure AS enables developers and BI professionals to create BI Semantic Models that can power highly interactive and rich analytical experiences in BI tools such as Power BI and Excel. Azure AS can consume data from a variety of sources residing in the cloud or on-premises (SQL Server, Azure SQL DB, Azure SQL DW, Oracle and Teradata, to name a few) and surface the data in any major BI tool. You can provision an Azure AS instance in seconds, pause it, and scale it up or down (planned during preview) based on your business needs. Azure AS enables business intelligence at the speed of thought! For more details, see the Azure blog.

2. SQL Server 2016 DW Fast Track Reference Architecture, 145TB
We have collaborated with a number of our hardware partners in a joint effort to deliver validated, pre-configured solutions that reduce complexity and drive optimization when implementing a data warehouse based on SQL Server 2016 Enterprise Edition. Today, I am happy to announce the Data Warehouse Fast Track (DWFT) reference architectures, which certify that a SQL Server 2016 SMP unit can support active data sets of up to 145TB and a maximum user size of 1.2 petabytes of SQL data in an all-flash system. These reference architectures provide tested and validated configurations and resources to help customers build the right environment for their data warehouse solutions. Following these best practices and guidelines will yield these tangible benefits:
3. Data Migration Assistant and Database Experimentation Assistant
Today, I am also happy to announce that we are releasing the Data Migration Assistant (DMA) v2.0. DMA reduces the effort required to upgrade from legacy SQL Servers to the latest SQL Server 2016 by detecting compatibility issues that can impact database functionality after an upgrade. It recommends performance and reliability improvements for your target environment and then migrates the entire SQL Server database. Furthermore, DMA provides seamless assessments and migrations to SQL Azure VM. DMA assessments discover the breaking changes, behavioral changes and deprecated features that can affect your upgrades. DMA also discovers the new features in the target SQL Server platform that your applications can benefit from after an upgrade. DMA is the only tool providing comprehensive data platform movement capabilities, assisting DBAs with more than just schema and data migrations. DMA v1.0 was released on August 26, 2016 for general availability. Since then, DMA has been downloaded more than 2,000 times worldwide, assessing more than 25,000 databases (37K cores) with over 1,000 unique users.
Another tool that we are bringing to market today is the Database Experimentation Assistant (DEA), a new A/B testing solution for SQL Server upgrades. It enables customers to conduct experiments on database workloads across two versions of SQL Server. Customers upgrading from older versions (2005 and above) to any newer version of SQL Server can use key performance insights, captured from a real-world workload, to build confidence in the upgrade among database administrators, IT management and application owners while minimizing upgrade risks. This enables truly risk-free migrations.

4. Azure SQL Data Warehouse Expanded Free Trial
I am particularly excited to announce the exclusive Azure SQL Data Warehouse free trial. Starting today, customers can request a one-month free trial for Azure SQL Data Warehouse. You can bring your data in, try out the capabilities of SQL Data Warehouse and complete POCs. This is a limited-time offer, so submit your request now. For PASS attendees (please watch my keynote) we have a special referral code that you can use when requesting the free trial. The SQL Data Warehouse team at PASS can also help you set up your free trial while you are there; you can find them at the Microsoft booth.
5. Cognitive Toolkit (formerly known as CNTK)
Starting today, we are announcing the availability of a beta of the Microsoft Cognitive Toolkit (formerly CNTK), a free, easy-to-use, open-source, commercial-grade toolkit that trains deep learning algorithms to learn like the human brain. The Cognitive Toolkit enables developers and data scientists to reliably train models faster than other available toolkits on massive datasets, across several processors, including CPUs, GPUs and FPGAs, as well as multiple machines. Upgrades include more programming flexibility, advanced learning methods like reinforcement learning, and extended API support for training and inference from Python, C++ and BrainScript, so developers can use popular languages and networks. The Cognitive Toolkit is available to the public under an open-source license, and it is one of the most popular deep learning projects on GitHub. It is used to develop commercial-grade AI in popular Microsoft products like Skype, Cortana, Xbox and Bing. Developers and researchers can start training with the Microsoft Cognitive Toolkit for free by visiting https://aka.ms/cognitivetoolkit.
SQL Server 2016 introduces the Real-time Operational Analytics feature, which allows you to run both transactional/operational and analytics workloads on the same system. Read on to learn how to create a filtered, non-clustered columnstore index.
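As a minimal sketch of the technique (the table, columns and filter predicate below are hypothetical, not from the article), the filtered index keeps hot, frequently updated rows out of the columnstore so the OLTP workload is not slowed by it:

```sql
-- Hypothetical orders table for a real-time operational analytics scenario.
CREATE TABLE dbo.Orders (
    OrderID     INT IDENTITY PRIMARY KEY,
    CustomerID  INT NOT NULL,
    OrderTotal  MONEY NOT NULL,
    OrderStatus NVARCHAR(20) NOT NULL,
    OrderDate   DATETIME2 NOT NULL
);

-- The filtered nonclustered columnstore index covers only completed orders,
-- so in-flight ("hot") rows stay out of the analytics structure.
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_Orders_Completed
ON dbo.Orders (CustomerID, OrderTotal, OrderStatus, OrderDate)
WHERE OrderStatus = N'Completed';
```

Analytical queries that touch only completed orders can then scan the compressed columnstore while transactional inserts and updates continue against the rowstore.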
I am happy to announce the new major release of Database .NET 19.8, an innovative, powerful and intuitive multiple-database management tool. With it you can browse/grant objects, design tables, edit rows, run queries, generate scripts, analyze/monitor SQL and import/export/migrate/sync data with a consistent interface.
Free, All-In-One, Portable, Single executable file and Multi-language.
Major New features from version 19.0 to 19.8:
This post is by Joseph Sirosh, Corporate Vice President of the Data Group at Microsoft.
We are only 6 days away from PASS Summit 2016 (@sqlpass), which kicks off next Wednesday, October 26th, in Seattle. SQL PASS Summit (#SQLSummit, #sqlpass) is the world's largest and most intensive technical training conference for Microsoft SQL Server (@SQLServer) and BI professionals. But more than that, it's a conference planned and presented by the SQL Server community for the SQL Server community (#sqlfamily). It has the most technical sessions, the largest number of attendees, the best networking, and the highest-rated sessions and speakers of any SQL Server event in the world. If you are in the Seattle area and would like to attend the Summit, you still have time to register. If you cannot make it, the event will be live-streamed just like last year.
I am very excited to give a keynote again at PASS Summit this year. I have a lot of exciting things to share with you since last October. SQL Server 2016 is an industry leader and packs an even bigger punch in the recent Gartner Magic Quadrant for Operational Database Management Systems: Microsoft is positioned as a leader, highest in execution and furthest in vision for 2 years in a row. SQL Server 2016 builds on this leadership and is packed with powerful built-in features. As the least vulnerable database for six years in a row, SQL Server 2016 offers security that no other database can match. It also has the data warehouse (Data Warehouse Fast Track) with the highest price-performance, and offers end-to-end mobile BI solutions on any device at a fraction of the cost of other vendors. It provides tools to go beyond BI with in-database advanced analytics, integrating the R language and scalable analytics functions. The world around us, every business and nearly every industry, is being transformed by technology today, and SQL Server 2016 was built for this new world and to help businesses get ahead of today's disruptions.
I am very excited to present at PASS our cutting-edge content around SQL Server 2016 to solve real-world problems, and I will have a number of top-notch speakers joining me on the stage.
Rohan Kumar (@RohanKSQL), General Manager of Database Systems, will join me on the stage and talk about SQL Server 2016 and how customers around the world rely on SQL Server to build applications that deliver unmatched security and performance for their most demanding workloads.
Rohan will also present a fascinating demo, Machine Learning @ 1,000,000 predictions per second, in which he will show real-time predictive fraud detection and scoring in SQL Server 2016. This is made possible by combining the performance of SQL Server in-memory OLTP and in-memory columnstore with R and machine learning. Rohan will also talk about HTAP: how SQL Server 2016 enables you to analyze historical data alongside real-time in-memory processing of both transactions and analytics, with 100x faster in-memory analytics and 30x faster in-memory OLTP. Rohan will also cover our "better together" story of Windows Server 2016 and SQL Server 2016 working together. With the release of Windows Server 2016, SQL Server supports unparalleled scalability, with a single SMP machine today supporting up to 24TB of memory and 640 cores.
Then joining Rohan on the stage will be one of our customers, PROS (@PROS_Inc), who will talk about how SQL Server 2016 with built-in R enabled them to better serve their customers. One of their customers is an airline that needs to respond to over 100 million unique, individualized price requests each day in under 200 milliseconds. It is practically impossible for humans to do this: to understand the market economics using all available data, and to do so in under 200 milliseconds. The combination of SQL Server 2016 and the Azure cloud provided the unified platform and global footprint that made it much easier for PROS to accomplish this, and they will tell you all about it.
By the way, I highly recommend the free ebook that just came out, Data Science with Microsoft SQL Server 2016, to learn how to install, configure, and use Microsoft's SQL Server R Services in data science projects and tackle scenarios similar to PROS's.
Rohan will tell you how Microsoft and Intel (@Intel) engineering work closely together to tune and optimize our technologies for performance. This is an exciting year for both companies, with the latest Intel Xeon processor E5 v4 family and the introduction of SQL Server 2016. Refreshing both hardware and software yields high double-digit performance gains with SQL Server 2016. Furthermore, data is continually growing, and this is another step Intel and Microsoft are taking together to enable scalable, cost-effective, large data warehouses for advanced analytics. Intel and Microsoft engineering collaborated to enable a 100TB data warehouse within a single 4-socket Intel Xeon E7 v4 server running Intel SATA SSD storage. Intel will come and share these impressive performance results with the PASS community.
Another fascinating customer that will join me at PASS is Next Games (@nxtgms). Kalle Hiitola (@kallehiitola), CTO and co-founder of Next Games, will tell their story. Next Games is a Finnish game developer and publisher based in Helsinki. They develop free-to-play games for iOS and Android. You might know some other big names from the same region, like Supercell and Rovio. With the help of Azure, they have been able to create globally scalable games with close to 17 million downloads!
Next Games will talk about how Azure and our NoSQL service, Azure DocumentDB (@DocumentDB), power their games. The biggest of their titles is The Walking Dead: No Man's Land, with millions of installs. For modern players everything needs to happen instantly, or you lose their attention; to prevent that, Next Games needed the most powerful database they could get: Azure DocumentDB.
Another amazing customer that will join me on the stage is Integral Analytics. Integral Analytics is a data science and analytics software company that provides core operational, planning, and valuation applications for utilities and other participants in the electric-energy industry. Integral Analytics' proprietary products help the largest energy companies in the United States and Canada navigate the "grid edge" reliably and cost-efficiently. To perform its mission successfully, Integral Analytics needed a cloud provider that supports batch processing, real-time processing, easy development, and integrated advanced-analytics services. Come and find out directly from Integral Analytics about Azure SQL Data Warehouse (#AzureSqlDW).
Another amazing presenter who will join me on the stage is Julie Koesmarno. Many of you know Julie very well as @MsSQLGirl. Julie will present an incredible demo called "Big Cognition". Software thus far has been "sterile", unable to understand or use human emotions, or to combine them with anything else. Using our cutting-edge technology, you can now extract emotional sentiments from images, videos, speech and text, and you can do it in bulk. You can now join emotions from image content with any other type of data you have and do incredibly powerful analytics and intelligence over it. This is what I mean by Big Cognition. It's not just extracting one piece of cognitive information at a time, not just understanding an emotion or whether there's an object in an image; rather, it's about joining all the extracted cognitive data with other types of data, so you can do some real magic with it. You really don't want to miss this demo by Julie.
Another partner I will be talking about at the keynote is eSmart Systems (@eSmart_Systems). I will talk about their Connected Drone project.
The objective of Connected Drone is to support inspections of power lines, which today are performed either by ground crews walking miles and miles of power lines or through dangerous helicopter missions to aerially monitor the lines (if there is one place you don't want humans in helicopters, it's over high-voltage power lines). With Connected Drone, eSmart uses deep learning to automate as much of the inspection process as possible. As they fly over power lines, the drones stream live data through Azure for analytics. eSmart Systems uses different types of neural networks, including deep neural networks (DNNs), to do this. They analyze both still images and videos from the drones and are able to recognize objects in real time. I will tell you how combining our powerful database engines with deep learning can enable previously unthought-of scenarios and products.
Finally, SQL Server MVP Jen Stirrup (@jenstirrup) will join me on the stage. Jen is very well known in the SQL PASS community and is also a PASS board member. Jen wanted me to keep her demo a surprise, so you'll just have to watch it. All I'll say is two things: it's something that's never been done at PASS Summit before, and I am really excited to have her present with me in the keynote. In the end, this demo might be all people talk about after the keynote.
SQL PASS is a must-watch, must-see, must-do event! I look forward to seeing you all there next week.
As an IT company, it is crucial that you maintain uptime and monitor the performance of various processes. By making database load balancing a key element, you can ensure that your processes and websites run smoothly.
Reposted from the Cortana Intelligence & Machine Learning blog.
The world around us, every business and nearly every industry, is being transformed by technology. SQL Server 2016 was built for this new world and to help businesses get ahead of today's disruptions. With this free eBook, you will learn how to install, configure and use Microsoft's SQL Server R Services in your data science and advanced analytics projects.
New milestone PostgresDAC release is out! Now with PostgreSQL 9.6 and Android target platform support.
From now on, NUMERIC fields are mapped to TFmtBcdField, which allows manipulating values of this type with immense flexibility.
This release contains a breaking change, so developers should check their sources wherever NUMERIC fields are used.

Full changelog:
You're welcome to download PostgresDAC v3.2.0 right now at http://microolap.com/products/connectivity/postgresdac/download/ or log in to your private area on our site at http://microolap.com/my/downloads/

Feedback
Please don't hesitate to ask any questions or report bugs with our Support Ticketing system available at http://www.microolap.com/support/
This post was authored by Rimma Nehme, Technical Assistant, Data Group.

Machine Learning Model Management
If you are a data scientist, business analyst or machine learning engineer, you need model management: a system that manages and orchestrates the entire lifecycle of your learning models. Analytical models must be trained, compared and monitored before being deployed into production, and many steps must take place to operationalize a model's lifecycle. There isn't a better tool for that than SQL Server!

SQL Server as an ML Model Management System
In this blog, I will describe how SQL Server can enable you to automate, simplify and accelerate machine learning model management at scale: from build, train, test and deploy all the way to monitor, retrain and redeploy or retire. SQL Server treats models just like data, storing them as serialized varbinary objects. As a result, it is largely agnostic to the analytics engines used to build the models, making it a good model management tool not only for R models (R is now built into SQL Server 2016) but for other runtimes as well.
SELECT * FROM [dbo].[models]
Figure 1: Machine Learning model is just like data inside SQL Server.
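A minimal sketch of such a table (the schema and names below are illustrative, not from the post) shows how a serialized model sits alongside ordinary metadata columns:

```sql
-- Models stored as serialized varbinary blobs next to their metadata.
CREATE TABLE dbo.models (
    model_id    INT IDENTITY PRIMARY KEY,
    model_name  NVARCHAR(128) NOT NULL,
    model       VARBINARY(MAX) NOT NULL,  -- e.g. a serialized R object
    trained_on  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);

-- A model serialized client-side can be inserted like any other row:
-- INSERT INTO dbo.models (model_name, model) VALUES (N'churn_v1', @serialized_model);
```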
The SQL Server approach to machine learning model management is an elegant solution. While there are existing tools that provide some capabilities for managing models and deployment, using SQL Server keeps the models "close" to the data, so nearly all the capabilities of a management system for data transfer seamlessly to machine learning models (see Figure 2). This can tremendously simplify the process of managing models, resulting in faster delivery and more accurate business insights.
Figure 2: By pushing machine learning models inside SQL Server 2016 (on the right), you get throughput, parallelism, security, reliability, compliance certifications and manageability, all in one. It's a big win for data scientists and developers: you don't have to build the management layer separately. Furthermore, just like data in databases can be shared across multiple applications, you can now share the predictive models. Models and intelligence become "yet another type of data", managed by SQL Server 2016.

Why Machine Learning Model Management?
Today there is no easy way to monitor, retrain and redeploy machine learning models in a systematic way. In general, data scientists collect the data they are interested in, prepare and stage the data, apply different machine learning techniques to find a best-of-class model, and continually tweak the parameters of the algorithm to refine the outcomes. Automating and operationalizing this process is difficult. For example, a data scientist must code the model, select parameters and a runtime environment, train the model on batch data, and monitor the process to troubleshoot errors that might occur. This process is repeated iteratively on different parameters and machine learning algorithms, and after comparing the models on accuracy and performance, the model can then be deployed.
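As an illustration of the deployment end of that loop (a sketch only; the table, column and model names are hypothetical), SQL Server 2016's sp_execute_external_script can score fresh rows against a model stored in the database:

```sql
-- Fetch a previously trained, serialized R model (hypothetical table).
DECLARE @model VARBINARY(MAX) =
    (SELECT model FROM dbo.models WHERE model_name = N'churn_v1');

-- Run the model against new rows and return the predictions as a result set.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        mdl <- unserialize(model);
        OutputDataSet <- as.data.frame(predict(mdl, InputDataSet));',
    @input_data_1 = N'SELECT * FROM dbo.customers_to_score',
    @params = N'@model VARBINARY(MAX)',
    @model = @model;
```

Because scoring runs where the model and the data live, the retrain-redeploy cycle becomes an ordinary database operation rather than a separate service to build and operate.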
Currently, there is no standard method for comparing, sharing or viewing models created by other data scientists, which results in siloed analytics work. Without a way to view models created by others, data scientists leverage their own private library of machine learning algorithms and datasets for their use cases. As models are built and trained by many data scientists, the same algorithms may be used to build similar models, particularly if a certain set of algorithms is common for a businessâs use cases. Over time, models begin to sprawl and duplicate unnecessarily, making it more difficult to establish a centralized library.
Figure 3: Why SQL Server 2016 for machine learning model management.
In light of these challenges, there is an opportunity to improve model management.

Why SQL Server 2016 for ML Model Management?
There are many benefits to using SQL Server for model management. Specifically, you can use SQL Server 2016 for the following:
Here is the list of specific capabilities that makes the above possible:

ML Model Performance:
In summary, SQL Server delivers top-notch data management with performance, security, availability, and scalability built into the solution. Because SQL Server is designed to meet security standards, it has a minimal total surface area and database software that is inherently more secure. Enhanced security, combined with built-in, easy-to-use tools and controlled model access, can help organizations meet strict compliance policies. Integrated high-availability solutions enable faster failover and more reliable backups, and they are easier to configure, maintain, and monitor, which helps organizations reduce the total cost of model management (TCMM). In addition, SQL Server supports complex data types and non-traditional data sources, and it handles them with the same attention, so data scientists can focus on improving model quality and outsource all of the model management to SQL Server.

Conclusion
Using SQL Server 2016 you can do model management with ease. SQL Server is unique among machine learning model management tools because it is a database engine optimized for data management. The key insight here is that "models are just like data" to an engine like SQL Server, so we can leverage most of the mission-critical data management features built into SQL Server for machine learning models. Using SQL Server for ML model management, an organization can create an ecosystem for harvesting analytical models, enabling data scientists and business analysts to discover the best models and promote them for use. As companies rely more heavily on data analytics and machine learning, the ability to manage, train, deploy and share models that turn analytics into action-oriented outcomes is essential.
The new era of data-driven and intelligent applications is here, powered by a robust data platform and a secure cloud infrastructure. The cloud gives us the power to collect and integrate data from an enormous variety of sources, to process big data at amazing scale and economics, to dramatically simplify development and deployment, and to offer amazing intelligent APIs and applications as hosted services.
Microsoft's data platform and Azure data services are packed with features to make apps more customer-friendly, more secure, more performant, more intelligent and, most of all, easier to develop! From the security benefits of Dynamic Data Masking and built-in intelligence/analytics to the performance improvements of in-memory OLTP, Microsoft's data platform and Azure provide developers with the tools they need to develop modern, smarter, intelligent applications.
Sounds good, right? Join Microsoft data experts Tobias Ternstrom, Lindsey Allen, and many more at the PASS Summit pre-conference session, Taking A Hands-On Approach to Building Intelligence into Data Driven Applications. Bring your device to this hands-on session and get a full day of app building, exploring these features in depth with Microsoft Product Engineering walking you through each step. The data-for-development immersion program includes 15 labs based on real-life scenarios developers face every day.
At the end of the day you will walk away with sample code and knowledge on how to create your own data-driven intelligence application.
Assume you’ve been given a directive by IT management to "tune" things. What strategies are available and how do you judge which ones are best? Considering the current economic climate, can any be implemented on a shoestring budget? Can any of these tasks be assigned to resources with basic DBA skills and still be productively accomplished?
We are excited to announce the public availability of SQL Server 2016 Express Edition in Windows Containers! The image is now available on Docker Hub and the build scripts are hosted on our SQL Server Samples GitHub repository. This image can be used in both Windows Server Containers as well as Hyper-V Containers.
We hope you will find these images useful and leverage them for your container-based applications!

Why use SQL Server in containers?
SQL Server 2016 in a Windows container would be ideal when you want to:
Before you can get started with the SQL Server 2016 Express Edition image, you'll need a Windows Server 2016 or Windows 10 host with the latest updates, the Windows Container feature enabled, and the Docker engine.
Please find the details for each of these requirements below.
Below are the Docker pull and run commands for running a SQL Server 2016 Express instance in a Windows Container.
Make sure that the mandatory sa_password environment variable meets the SQL Server 2016 Password Complexity requirements.
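As a rough pre-flight check, those complexity rules can be sketched in Python. This is an illustrative helper, not the authoritative policy; it assumes the commonly documented default of at least 8 characters drawing on at least three of the four character classes:

```python
import re

def meets_sql_password_policy(password: str) -> bool:
    """Rough approximation of SQL Server's default password complexity:
    at least 8 characters, using at least three of the four classes
    (uppercase, lowercase, digits, symbols)."""
    if len(password) < 8:
        return False
    classes = [
        bool(re.search(r"[A-Z]", password)),          # uppercase letters
        bool(re.search(r"[a-z]", password)),          # lowercase letters
        bool(re.search(r"[0-9]", password)),          # digits
        bool(re.search(r"[^A-Za-z0-9]", password)),   # symbols
    ]
    return sum(classes) >= 3

print(meets_sql_password_policy("Str0ng!Pass"))  # True
print(meets_sql_password_policy("weakpass"))     # False
```

If the sa_password you pass to docker run fails the real policy, SQL Server setup inside the container will fail, so a quick local check like this can save a restart cycle.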
First, pull the image
docker pull microsoft/mssql-server-2016-express-windows
Then, run a SQL Server container
Running a Windows Server Container (Windows Server 2016 only):
docker run -d -p 1433:1433 --env sa_password=<YOUR_PWD> microsoft/mssql-server-2016-express-windows
Running a Hyper-V Container (Windows Server 2016 or Windows 10):
docker run -d -p 1433:1433 --env sa_password=<YOUR_PWD> --isolation=hyperv microsoft/mssql-server-2016-express-windows

Connecting to SQL Server 2016

From within the container
An easy way to connect to the SQL Server instance from inside the container is by using the sqlcmd utility.
First, use the docker ps command to get the container ID that you want to connect to and use it to replace the parameter placeholder "<DOCKER_CONTAINER_ID>" in the commands below. You can use the docker exec -it command to create an interactive command prompt that will execute commands inside of the container.
You can connect to SQL Server by using either Windows or SQL Authentication.
Windows authentication using container administrator account
docker exec -it <DOCKER_CONTAINER_ID> sqlcmd
SQL authentication using the system administrator (SA) account
docker exec -it <DOCKER_CONTAINER_ID> sqlcmd -S . -U sa

From outside the container
One of the ways to access SQL Server 2016 from outside the container is by installing SQL Server Management Studio (SSMS). You can install and use SSMS either on the host or on another machine that can remotely connect to the host.
Connect from SSMS installed on the host
To connect from SSMS installed on the host, you'll need the following information:
Connect from SSMS on another machine (other than the Host Environment)
To connect from SSMS installed on another machine (that can connect to the host), you'll need the following information:
Please refer to this link for all SQL Server 2016 features that are supported on a Windows Server Core installation.

Developing Using Windows 10 Containers
Check out this blog post by Alex Ellis, Docker Captain, on how to use SQL Server 2016 Express Edition in a Windows container as part of an application development and test environment on Windows 10.
Oracle 12c offers a new way to match patterns in your data, the MATCH_RECOGNIZE function. Read on to see the syntax and the function in action.
Redwood Shores, Calif.—Oct 12, 2016
Yesterday, the United States District Court for the District of Nevada issued two significant rulings in Oracle's litigation against Rimini Street and its CEO Seth Ravin.
The first ruling is a permanent injunction barring Rimini Street from continuing its infringement of Oracle's copyrights and other unlawful acts. The Court previously determined that a permanent injunction was warranted, noting Rimini Street's "callous disregard for Oracle's copyrights and computer systems when it engaged in the infringing conduct" and that "Rimini's business model was built entirely on its infringement of Oracle's copyrighted software and its improper access and downloading of data from Oracle's website and computer systems."
The Court's four-page permanent injunction prohibits certain copying, distribution, and use of Oracle's copyrighted software and documentation by Rimini Street and also imposes limitations on Rimini Street's access to Oracle's websites. The order states, for example, that Rimini Street may not use a customer's software environment "to develop or test software updates or modifications for the benefit of any other licensee." The order also prohibits Rimini Street's preparation of certain derivative works from Oracle's copyrighted software. The order applies not only to Rimini Street, but also "its subsidiaries, affiliates, employees, directors, officers, principals, and agents."
"This permanent injunction imposes important restrictions on Rimini Street," said Oracle's General Counsel Dorian Daley, "and Oracle is grateful that the Court has taken steps to prevent continuing unlawful acts by Rimini Street and its executives."
Although Rimini Street has stated that there was "no expected impact" from any injunction, Rimini Street also told the Court that it "could suffer significant harm to its current business practices if the proposed injunction were entered," which it now has been. These contradictory positions raise the issue of whether Rimini is misleading the Court or making misrepresentations to customers.
In its second ruling, the Court awarded Oracle approximately $27.7 million in prejudgment interest, with an additional amount of prejudgment interest to be awarded based on the date of the final judgment. This is in addition to the $50 million jury verdict and the $46.2 million in attorneys' fees and costs awarded to Oracle last month.
Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor
The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.

Talk to a Press Contact
Follow Oracle Corporate
This post is by Joseph Sirosh, Corporate Vice President of the Data Group at Microsoft.

Transactional Workloads + Intelligence
Online transaction processing (OLTP) database applications have powered many enterprise use cases in recent decades, with numerous implementations in banking, e-commerce, manufacturing and many other domains. Today, I'd like to highlight a new breed of applications that marry the latest OLTP advancements with advanced insights and machine learning. In particular, I'd like to describe how companies can predict a million events per second with the very latest algorithms, using readily available software. We showed this demo at the Microsoft Machine Learning and Data Science Summit and at my General Session at Ignite in Atlanta, Georgia. You can watch both online. The predictive model was based on a boosted decision tree algorithm with 50 trees and 33 features.
Take credit card transactions, for instance. These can trigger a set of decisions that are best handled with predictive models. Financial services companies need to determine whether a particular transaction is fraudulent or legitimate.
As the number of transactions per second (TPS) increases, so does the number of predictions per second (PPS) that organizations need to make. The Visa network, for instance, was capable of handling 56,000 TPS last year and managed over 100 billion yearly transactions. With each transaction triggering a set of predictions and decisions, modern organizations need a powerful platform that combines OLTP with a high-speed prediction engine. We expect that an increasing number of companies will need to hit 1 million predictions per second (PPS) or more in coming years.
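A back-of-envelope calculation shows how per-transaction predictions compound into the PPS figures above. The predictions-per-transaction multiplier below is invented purely for illustration; only the peak TPS and yearly volume come from the text:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

peak_tps = 56_000        # Visa's stated peak capacity (from the text)
yearly_txns = 100e9      # over 100 billion transactions per year (from the text)
avg_tps = yearly_txns / SECONDS_PER_YEAR  # sustained average, far below peak

# If each transaction triggers several predictions (fraud, credit risk,
# offer targeting, ...), peak prediction load multiplies accordingly.
predictions_per_txn = 18  # invented multiplier for illustration
peak_pps = peak_tps * predictions_per_txn

print(f"average TPS: about {avg_tps:,.0f}")
print(f"peak PPS: about {peak_pps:,}")
```

Even a modest bundle of per-transaction predictions pushes the peak load past the 1 million PPS mark, which is why the prediction engine's throughput matters as much as the OLTP engine's.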
What kind of architecture would enable such use cases? At Microsoft, we believe that computing needs to take place where data lives. This minimizes data movement, eliminates the associated costs and security risks, and keeps the prediction engine close to the database (i.e., in-database analytics). Moreover, the predictive models can be shared by multiple applications. That's precisely how SQL Server 2016 was designed.
Take the credit card fraud detection example I mentioned above; one can handle it in the following manner:
This is shown visually below:
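To make the prediction step concrete, here is a hedged, self-contained Python sketch of ensemble scoring in the spirit of the demo's boosted decision tree (the real model used 50 trees and 33 features; the stumps, feature names, thresholds and weights below are invented for illustration only):

```python
import math

# Toy stand-in for a boosted tree ensemble: each "stump" adds its weight
# to the raw score when the transaction's feature exceeds a threshold,
# and a sigmoid squashes the raw score into a fraud probability.
STUMPS = [
    # (feature name, threshold, weight) -- all values invented
    ("amount_usd",            900.0, 1.4),
    ("distance_from_home_km", 500.0, 0.9),
    ("txns_last_hour",          5.0, 1.1),
]

def fraud_probability(txn: dict) -> float:
    raw = sum(w for feat, thresh, w in STUMPS if txn.get(feat, 0.0) > thresh)
    return 1.0 / (1.0 + math.exp(-(raw - 1.0)))  # -1.0 bias, also invented

def is_fraud(txn: dict, cutoff: float = 0.5) -> bool:
    return fraud_probability(txn) >= cutoff

suspicious = {"amount_usd": 2500.0, "distance_from_home_km": 1200.0,
              "txns_last_hour": 8.0}
routine = {"amount_usd": 40.0, "distance_from_home_km": 3.0,
           "txns_last_hour": 1.0}
print(is_fraud(suspicious), is_fraud(routine))  # True False
```

In the SQL Server 2016 architecture this scoring logic lives in-database next to the transaction data, rather than in application code as sketched here, which is what eliminates the data movement discussed above.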
The above architecture is very versatile. In addition to using it in fraud detection, we've applied this architecture to perform what-if analysis on an auto loan dataset.

Analytical Workloads + Intelligence
Imagine a loan application where a financial services company needs to determine if a loan will be repaid on time. As with predicting fraudulent transactions, you can leverage SQL Server 2016 as a Scoring Engine to predict "bad" loans. Loans that indicate good repayment behavior are considered "good" and loans that indicate less-than-perfect repayment behavior are considered "bad". Imagine scanning through millions of loan applications and being able to predict, within seconds, which loans will default. Now imagine a business analyst launching the same exercise while modeling a scenario where the Federal Reserve increases interest rates. Our loan default prediction model was able to reach and exceed a staggering 1,250,000 predictions per second, completing the what-if analysis within 15-18 seconds. This capability now enables our customers to have near real-time predictive analytics. The architecture is shown visually below:
One of the common requests from customers is an intelligent method of predicting how changing factors like interest rates, loan terms or even a member's credit score would affect the charge-off probability. You can specify a what-if input for an increased interest rate and score the open loans with the new proposed rate using parallel threads, each calling a SQL Server stored procedure to invoke the scoring model on the open loans. You can then compare the base predictions with the what-if predictions, study how the probability of high charge-offs increases with the interest rate, and see how that may affect various branches of your business. Such near real-time predictive analytics capabilities minimize research bias, dramatically increase business flexibility and focus attention on the attributes that matter, which results in higher profitability.
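The parallel what-if pattern can be sketched in Python. This is an illustrative toy, not the production setup: in the real architecture the worker threads call a SQL Server stored procedure that invokes the in-database scoring model, whereas here an invented scoring function with made-up coefficients stands in for that call:

```python
from concurrent.futures import ThreadPoolExecutor

def charge_off_probability(loan: dict, rate_bump: float = 0.0) -> float:
    """Invented toy model: higher rates and lower credit scores raise risk.
    In production this would be a call to the in-database scoring procedure."""
    rate = loan["interest_rate"] + rate_bump
    score = 0.04 * rate - 0.0005 * loan["credit_score"] + 0.35
    return min(max(score, 0.0), 1.0)

def what_if(loans, rate_bump, workers=8):
    """Score every open loan twice in parallel: at the current rate and
    at the proposed (bumped) rate, so the two runs can be compared."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        base = list(pool.map(charge_off_probability, loans))
        bumped = list(pool.map(
            lambda loan: charge_off_probability(loan, rate_bump), loans))
    return base, bumped

loans = [{"interest_rate": 4.5, "credit_score": 720},
         {"interest_rate": 9.0, "credit_score": 610}]
base, bumped = what_if(loans, rate_bump=0.75)
# Under this toy model, every loan's charge-off probability rises with the bump
print(all(b > a for a, b in zip(base, bumped)))  # True
```

The comparison of base versus bumped probabilities is the same shape of analysis as the interest-rate what-if described above, just at two loans instead of millions.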
At Ignite, we had Jack Henry & Associates on the stage with me. They provide more than 300 products and services to over 10,000 credit unions, enabling them to process financial transactions and automate their services. Using SQL Server as a Scoring Engine enabled their vision of building an intelligent enterprise data warehouse that would help their customers increase their productivity. They have been working with Microsoft to leverage SQL Server with built-in R capability to build intelligence into their current data warehousing service portfolio. Such an intelligent data warehouse helps credit unions and financial services become more flexible and react to situations in a data-driven manner. We see opportunities in applying such models within the database to customer churn prediction, predicting loan portfolio changes and a host of other scenarios. Several banking and insurance companies rely on very complex architectures to do predictive analytics and scoring today. Using the architecture outlined in this blog, businesses can do this in a dramatically simpler and faster manner.
The possibilities are endless.
We've posted several samples on GitHub. The available templates are listed below.
This is how companies are predicting at the speed of data, today.
PASS Summit 2016 is nearly upon us. With only 4 weeks until the big event, now is the time to register!
PASS Summit 2016 is a community-driven event with three days of technical sessions, networking, and professional development. Don't miss your chance to stay up to date on the latest and greatest Microsoft data solutions along with 4,000 of your data professional and developer peers.

What's new this year? So many things!
• PASS Summit is not just for DBAs. With nearly 1,000 developers attending the event, Microsoft has increased the number of sessions focused on application development and developer tools by 60%.
• While many people attend PASS Summit to grow fundamental database skills, we know that many attendees are very experienced, senior data professionals, so we increased the number of deep technical sessions by half.
• We have also added a new type of session called a Chalk Talk. These are Level 500 sessions with Microsoft senior program management hosting open Q&A in a collegiate-style setting. Seating is limited to 50, so you'll want to get there early to claim your spot.
In addition to these enhancements, Microsoft has also increased its investment in sending employees onsite to talk with attendees. They'll be easy to spot: all 500 Microsoftees will be wearing bright fuchsia t-shirts. You can find them in big numbers at the Day 1 keynote, the Microsoft booth, SQL Clinic, Wednesday's Birds of a Feather luncheon, Thursday's WIT luncheon, and of course in our big booth in the Expo Hall.

Have a technical challenge or need architecture advice?
SQL Clinic is the place to be. SQL Clinic is the hub of technical experts from SQLCAT, the Tiger Team, CSS, and others. Whether you are looking for SQL Server deployment support, have a troublesome technical issue, or are developing an application, the experts at SQL Clinic will have the right advice for you.
Are you a member of a PASS chapter or virtual chapter? If so, remember to take advantage of the $150 discount code. Contact your chapter leader for details.
Sending your whole team? There is also a great group discount for companies sending five or more employees.
Once you get a taste of the learning and networking waiting for you at PASS Summit, we invite you to join the conversation by following @SQLServer on Twitter as well as @SQLPASS and #sqlsummit. We're looking forward to an amazing event, and can't wait to see everyone there!
Stay tuned for regular updates and highlights on Microsoft and PASS activities planned for this yearâs conference.