Software Development News: .NET, Java, PHP, Ruby, Agile, Databases, SOA, JavaScript, Open Source

Methods & Tools

Subscribe to Methods & Tools
if you are not afraid to read more than one page to be a smarter software developer, software tester or project manager!

Database

Microsoft Business Intelligence PASS keynote: Five minutes to sign up; five minutes to WOW!

See how you can get real-time interactive visualizations of customer data as Microsoft Corporate Vice President James Phillips demonstrates exciting new capabilities available with Power BI and Microsoft’s powerful data platform, including SQL Server, SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Datazen, and Azure SQL Database. In this video of Phillips’ PASS Summit session, you’ll learn about all the new capabilities available now, as well as Microsoft’s roadmap for business intelligence (BI). Among the exciting demonstrations Phillips offers, you’ll see how Power BI history and cross-filtering, along with integration of Microsoft Research machine learning and the Azure cloud platform, let you take advantage of a hybrid solution that delivers game-changing business insights.


In its most recent Agile BI Wave report, Forrester Research shows Microsoft BI moving all the way to the upper right as a leader in BI. Forrester says that with Microsoft’s BI innovations, the “sleeping giant” has awoken. In fact, Phillips notes that Power BI makes Microsoft the world’s fastest-growing visualization platform, with users in more than 185 countries signing up for—and actively using—Power BI. Microsoft is passionate about making Power BI quick, easy, and free so that customers have the experience of “five minutes to sign up; five minutes to WOW!” with Power BI.

With all the insights into customer and business activity available in today’s digital world, BI is a requirement for success. Check out this informative and fascinating video and find out how Microsoft is acting on the fact that “software is the greatest driving force in bringing business closer to customers … [and] software makes data…Organizations that turn data into value through insights will thrive.”

Click here to learn about currently available BI capabilities and how they fit together to form an unmatched solution that’s moving forward at an unmatched pace to deliver insights.

Categories: Database

Data security, SQL Server 2016, and your business

Security is unquestionably a major priority for Microsoft. A recent news story reported that the company “is spending $1 billion a year to make Microsoft products more secure.” The Microsoft data platform, including SQL Server and Azure SQL Database, is at the top of the list of products investing in security. But a commitment to data security is nothing new. SQL Server has long been recognized for its outstanding security record: According to the National Institute of Standards and Technology (NIST)1 public security board, for the past six years, SQL Server has had the fewest security vulnerabilities when compared with the major database vendors. In addition, SQL Server has been deemed “the most secure database” by the Information Technology Industry Council (ITIC). Despite this excellent security record, Microsoft is not content to rest on its laurels and is continuing to invest in security, providing customers with new and improved tools to secure data and applications.

From an IT infrastructure and compliance perspective, the importance of protecting data is clear. Witness the fact that security has been identified as one of the “Eight emerging data center trends to follow in 2016.” But data protection also has profound business implications and can even be a competitive differentiator by helping drive customer loyalty and retention, create opportunities for premium offers and new sources of revenue, and protect future revenue streams, according to Forrester Research.2 To help deal with the complexity and scope of data security — and diminish risks to your business — Microsoft provides an across-the-board, in-depth security approach that includes application security, network security, and database security.

Data Security and SQL Server

Playing into this overall approach, SQL Server 2016 and Azure SQL Database include advanced, layered security functionality to help protect data itself as well as access to that data, and then provide monitoring capabilities. Data security features include (but are not limited to) the following:

  • Always Encrypted enables encryption inside client applications without revealing encryption keys to SQL Server. It allows changes to encrypted data without the need to decrypt it first.
  • Transparent Data Encryption (TDE) protects data at rest by encrypting all the user data in data files. TDE prevents users from attaching or restoring a database to another server as a way to access the data.
  • Support for Transport Layer Security (TLS), which has now been updated to version 1.2, protects data in transit and offers protection from such tactics as man-in-the-middle attacks.
  • Dynamic Data Masking (DDM) and Row-Level Security (RLS) help developers build applications that require restricted direct access to certain data as a means of preventing users from seeing specific information.
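
To make the Row-Level Security item above concrete, here is a minimal T-SQL sketch; the table, function, and policy names are hypothetical, but the predicate-function and security-policy pattern is the standard RLS mechanism:

```sql
-- Hypothetical example: an inline table-valued function serves as the
-- security predicate. A row in dbo.Sales is visible only when its
-- SalesRep column matches the current database user.
CREATE FUNCTION dbo.fn_SalesPredicate (@SalesRep AS sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_result
    WHERE @SalesRep = USER_NAME();
GO

-- Bind the predicate to the table. From now on, every query against
-- dbo.Sales is filtered transparently for the querying user.
CREATE SECURITY POLICY dbo.SalesFilter
    ADD FILTER PREDICATE dbo.fn_SalesPredicate(SalesRep)
    ON dbo.Sales
    WITH (STATE = ON);
```

Because the filtering happens in the database engine, applications need no code changes to enforce the restriction.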

This layered approach to data security and Microsoft’s overall commitment to advancing security and privacy protection address important considerations for business today. Upcoming blogs will go into deep technical detail on these security capabilities, but examining a business scenario can help illuminate the business benefits that data security can help ensure.

Business implications

Data has become not only a business asset but also a competitive differentiator: A company that can ensure that customer and business data are secure has a competitive edge over one that does not make data security a priority. For business and technical decision-makers to enable their companies to compete effectively, they need a data platform with built-in security features and a strategy that takes advantage of those built-in capabilities.

The business implications of data security range from speeding up customer service, to impacting the bottom line, to protecting shareholder value. Underscoring the potential bottom-line concerns of financial executives, a recent survey found that 66 percent of CFOs consider security to be a high or very high priority. Even at the end-user level, the potential business impact of exposing sensitive data is recognized: Another recent survey discloses that “71 percent of end users say that they have access to company data they should not be able to see.”

How can Microsoft’s data security capabilities ease such concerns? Consider just one example showing how Dynamic Data Masking, as a part of your data security program, can help you address the point raised by those end users who admitted they had access to data (such as Social Security numbers or health details) that they shouldn’t be able to view. Suppose you have a call center where representatives deal with customer billing questions. When a customer record comes up, the representative needs to see certain information to answer questions. But some customer information, such as specific personal health details, needs to remain confidential for HIPAA compliance. With Dynamic Data Masking, IT administrators can take simple steps to define policies, or rules, to mask any personally identifiable information that is not needed for the customer interaction. This way, the representative can view a customer record without having access to confidential information. Customer information is secured, but at the same time, customer service can answer questions by accessing appropriate data without compromising privacy.
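
The call-center scenario can be sketched in T-SQL; the table and column names below are hypothetical, but the masking functions are the built-in DDM functions:

```sql
-- Hypothetical example: mask the SSN column so ordinary logins see
-- only the last four digits (e.g. XXX-XX-1234).
ALTER TABLE dbo.Customers
    ALTER COLUMN SSN ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XX-",4)');

-- Mask the email column with the built-in email() masking function.
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Principals granted UNMASK still see the full, unmasked values.
GRANT UNMASK TO SupervisorRole;
```

Masking is applied in query results only; the underlying data is unchanged, so privileged users and applications with the UNMASK permission continue to see real values.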

Commitment to security built in

As the article cited above emphasizes, Microsoft is spending $1 billion per year to ensure that its products are secure so that businesses are protected. SQL Server and Azure SQL Database continuously build in state-of-the-industry security technologies as part of this ongoing commitment. For business, this means you don’t have to pay extra to give IT staff security tools that are easy to deploy and maintain — those tools are built into Microsoft’s data platform. At the same time, businesses can build data security infrastructure that supports customers and provides a competitive edge. To learn more about Microsoft’s data security approach, see the Security Center for SQL Server Database Engine and Azure SQL Database and the SQL Security Blog.

See the other posts in the SQL Server 2016 blogging series

1. National Institute of Standards and Technology Comprehensive Vulnerability Database update 10/2015

2. The Future of Data Security And Privacy: Growth And Competitive Differentiation Vision: The Data Security And Privacy Playbook, John Kindervag, Heidi Shey, and Kelley Mak, Forrester, July 10, 2015

Categories: Database


Load Testing Your Big Data Application

Database Journal News - Thu, 01/14/2016 - 09:01

Many big data applications are designed, built and installed without a formal load test. This is unfortunate, as load testing gives the database administrator quite a lot of valuable information. It may make the difference between poor and acceptable big data performance. This article reviews big data application implementation practices with some proactive tips on load testing to make your production implementation a success.

Categories: Database

PipelineDB Enterprise Now Available

PostgreSQL News - Thu, 01/14/2016 - 01:00

SAN FRANCISCO, CA – January 14th, 2016 - PipelineDB (www.pipelinedb.com), a pioneer in streaming analytics database technology, today announced the release of PipelineDB Enterprise, a commercial edition of the company’s open-source database. PipelineDB Enterprise extends the functionality of the company’s open-source relational database, which runs SQL queries continuously on streaming data for realtime analytics applications, and adds horizontal scaling, high availability, and realtime alerting capabilities. PipelineDB and PipelineDB Enterprise are based on PostgreSQL 9.4 and will be refactored to be compatible with PostgreSQL 9.5 this quarter.

This new product enables anybody familiar with the ubiquitous SQL to build scalable, fault-tolerant analytics applications such as realtime reporting dashboards and realtime monitoring systems. The launch of PipelineDB Enterprise reinforces the company’s mission to give a wide range of developers an easy way to build scalable realtime applications without writing custom application code.

“An increasing number of organizations are beginning to think about data as perpetually moving streams rather than static and at rest. PipelineDB is highly conducive to applying this new mindset to information in ways that generate real business value, and PipelineDB Enterprise makes this possible at the largest scale,” said Derek Nelson, the company’s CEO and Co-Founder.

Since its launch in July 2015, the company says it has seen significant adoption in verticals including finance, telecommunications, advertising, gaming, and networking. They report having several paying enterprise customers, although they were unable to offer specific names due to being under NDA.

“Soon after we released our open-source core product, many of our biggest users wanted to pay for easy scaling, high availability, and support, which is exactly what PipelineDB Enterprise delivers. We have several customers currently piloting PipelineDB Enterprise and we are excited to announce its public availability today,” said Jeff Ferguson, the company’s President and Co-Founder.

PipelineDB Enterprise Highlights:

  • Shared-nothing architecture for seamless horizontal scaling
  • Replication and failover of hardware nodes for high availability
  • Realtime alerting capabilities
  • 24/7 enterprise support

Availability

PipelineDB Enterprise is offered under a commercial license for one and three year terms. To try PipelineDB Enterprise contact enterprise@pipelinedb.com. PipelineDB is free, open-source, and offered under the GPLv3 license. It can be downloaded from the company’s website at www.pipelinedb.com.

About PipelineDB:

PipelineDB is the leader in streaming analytics database technology. Their open-source relational database runs SQL queries continuously on streaming data and is used by the world’s leading companies to power realtime analytics systems. The company is based in San Francisco, CA and is backed by prominent Silicon Valley investors including Y-Combinator, SV Angel, Data Collective, Susa Ventures, and others.

Follow us on twitter @pipelinedb or email us at info@pipelinedb.com for more information.

Media Contact

Jeff Ferguson President and Co-Founder jeff@pipelinedb.com

Categories: Database, Open Source

Navicat for PostgreSQL version 11.2 - introducing Navicat Cloud Collaboration & support PostgreSQL 9.5

PostgreSQL News - Tue, 01/12/2016 - 01:00

PremiumSoft today released an upgraded version of Navicat for PostgreSQL, version 11.2, with full support for PostgreSQL 9.5.

The latest version, Navicat for PostgreSQL 11.2, introduces new features: a PL/PGSQL Code Debugger and Navicat Cloud Collaboration, a newly developed solution to help teams work better together.

"Navicat Cloud is a cloud-based solution that will align your development team for better collaboration across platforms," said Ken Lin, Software Development Director at PremiumSoft CyberTech Ltd. "The Navicat Cloud portal is an excellent solution that allows team members to instantly view a project shared by a colleague without a Navicat license, while the Desktop and iOS applications allow you to manage your projects, such as sharing connections, queries and models."

Major New Features:
  • Navicat Cloud Collaboration
    1. Create/Modify/Delete projects.
    2. Add members to project for sharing connection settings, queries and models.
    3. View Project Activity Log.
  • PostgreSQL Objects
    1. Support Foreign Server with User Mapping.
    2. Support Foreign Table.
    3. Support PostgreSQL Debugger.
    4. Enhanced Object Designers.
  • Data Modeling Tool
    1. Support Model Conversion.
    2. Support Views.
    3. Enhanced Table Designer.

Navicat Cloud Collaboration Capabilities

With Navicat Cloud Collaboration, Navicat customers now have the ability to invite a colleague to work together on a project, to assign roles to members, and to have visibility into the activities in the Activity Log.

Plans, Pricing and Availability

Starting today, Navicat Cloud is available for all new and existing Navicat customers. The Navicat Cloud Basic Plan (free) includes up to 150 units of storage and 3 projects; the Navicat Cloud Pro Plan features 5,000 units of storage and up to 500 projects for US$9.99 per month / US$99 per year. For more details, please visit: http://www.navicat.com/navicat-cloud

A free 14-day trial is also available for download; for more details, please go to: http://www.navicat.com/download/navicat-for-postgresql

To learn more about Navicat for PostgreSQL, please visit: http://www.navicat.com/products/navicat-for-postgresql

To learn more about Navicat Cloud Collaboration, please visit: http://www.navicat.com/navicat-collaboration

About Navicat

Navicat develops leading database management and development software. One of its top-rated products, Navicat Premium, allows you to access up to six databases in a single application, including MySQL, MariaDB, SQL Server, SQLite, Oracle, and PostgreSQL, eliminating workflow disruption and increasing productivity and efficiency.

About PremiumSoft

PremiumSoft CyberTech Ltd. is a multinational corporation headquartered in Hong Kong, the company was founded in 1999 and has developed numerous award-winning products over the years.

Categories: Database, Open Source

Oracle Enhances Retail Suite, Adds New Retail Cloud Services to Largest Portfolio of Enterprise SaaS Applications

Oracle Database News - Mon, 01/11/2016 - 21:51
Press Release

Comprehensive new release of cloud services, hardware, and software solutions enables retailers to enhance engagement with consumers online, in stores and via mobile.

Redwood Shores, Calif.—Jan 11, 2016

To help retailers deliver a consistent experience anywhere customers choose to shop, Oracle today introduced a comprehensive suite of cloud and on-premise solutions embedded with the industry-leading analytics that retailers worldwide can use to personalize offers, streamline operations, and increase sales and margins.

“Engaging and maintaining customer loyalty across channels requires exceptional insight and integration from planning to marketing to fulfillment, and with this release Oracle brings all the pieces together. For the first time, retailers can use Oracle Cloud services to pinpoint promotions customers want to see, forecast and meet demand for items with unique attributes and manage inventory in a singular, more effective manner across commerce, store and wholesale channels,” said Jill Standish (Puleri), senior vice president and general manager, Oracle Retail.

Oracle’s retail-specific solutions, combined with its broader product and service offering, are used by thousands of retailers worldwide to strengthen customer relationships, build brand loyalty and increase profits.  Now the company is introducing a major upgrade and expansion to its offering of cloud services and solutions.

With Oracle Retail Release 15, the company has embedded more of its leading business intelligence capabilities throughout retailers’ most critical processes and extended greater access and mobility via the cloud.  The new release also upgrades and integrates Oracle Retail Xstore Point-of-Service and merchandising solutions to share insights between stores, merchandising, pricing and sales audit processes.

Retailers Gain Faster Value with Four New Oracle Cloud Services

In a move that expands cloud access to its best-in-class solutions, Oracle is adding four new retail-specific cloud services to the industry’s most comprehensive SaaS, PaaS and IaaS offering. 

  • Using Oracle Retail Sales and Productivity Cloud Service, retailers can gain real-time insights into comparative sales, salesperson productivity, merchandise productivity, store sales, and store traffic.
  • Retailers can use Oracle Retail Merchandise Financial Planning Cloud Service to facilitate collaborative planning across commerce, store and wholesale channels and to better align the day-to-day decisions that impact sales, inventory buys and promotions with topline business strategy.
  • New levels of business intelligence embedded in the Oracle Retail Demand Forecasting Cloud Service enable retailers to identify and fine tune product selection, pricing and promotions for items sharing similar attributes – such as patterns or images that are “hot” fashion trends in the apparel industry or specific flavors of yogurt for grocers.
  • Retailers can increase sales and improve customer service by using Oracle Retail Customer Segmentation Science Cloud Service to identify why customers buy certain items and tailor offerings to meet their needs.

Customers Enjoy a Better Store Experience When Retailers Deploy Oracle Retail Xstore Point-of-Service Solutions

Oracle is improving the experience that customers have every time they walk into a store, by rolling out a series of new enhancements to its Oracle Retail Xstore Point-of-Service solutions.  The latest release evolves the point-of-service user experience by delivering greater mobility, improving exception management and enabling better customer engagement.

  • The new Oracle Retail Xstore Point-of-Service Workstation solution combines hardware and software to help store associates speed transactions via mobile point-of-service or at checkout.
  • Native integration between Oracle Retail Xstore Point-of-Service and Oracle Retail Merchandising solutions enable retailers to streamline inventory, fulfillment and planning by ensuring that merchandising and store solutions share the same merchandise hierarchy, item attributes, tag and label information, inventory and transaction information, and price management information.  Oracle Retail Xstore Point-of-Service is also integrated with Oracle Retail Merchandising System, Oracle Retail Price Management and Oracle Retail Sales Audit.

Retailers Gain Mobility, Insight and Efficiency with New Oracle Solutions

Oracle is delivering new best-in-class retail analytics and optimization technology throughout Oracle Retail Release 15, providing retailers greater insight and efficiencies. New Oracle algorithms cut in half the time required to manually complete the exception-based processes that typify retailers’ complex invoice matching process by automating up to 90 percent of invoice matches. 

Across its entire offering, Oracle has made its solutions more mobile and designed its user interface to appeal to anyone already accustomed to common search and reporting tools as well as standard Internet browser applications.

Contact Info

Greg Lunsford
Oracle
+1.650.506.6523
greg.lunsford@oracle.com

Mary Ellen Amodeo
Amodeo Associates
+1.612.963.5797
meamodeo@gmail.com

About Oracle

Oracle offers a comprehensive and fully integrated stack of cloud applications and platform services. For more information about Oracle (NYSE:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor

The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation. 

Learn More

  • Oracle V15 Webinar
  • Join the Oracle Retail Community

Talk to a Press Contact

Greg Lunsford

  • +1.650.506.6523

Mary Ellen Amodeo

  • +1.612.963.5797

Categories: Database, Vendor

Preview the newest ODBC SQL Server Driver for Windows and Linux

We are pleased to announce the community technology preview of Microsoft ODBC Driver 13 for SQL Server on Windows and Linux, supporting Ubuntu, RedHat and SUSE distributions! The updated driver provides robust data access to Microsoft SQL Server and Microsoft Azure SQL Database via ODBC on Windows and Linux platforms.

Always Encrypted for Windows and Linux

You can now use Always Encrypted with the Microsoft ODBC Driver on Linux and Windows. Always Encrypted is a new SQL Server 2016 and Azure SQL Database security feature that can help prevent sensitive data from being seen in plaintext in a SQL Server instance. It lets you transparently encrypt the data in the application, so that SQL Server only ever handles the encrypted data, never plaintext values. Even if the SQL Server instance or the host machine is compromised, an attacker gets only ciphertext of the sensitive data. To use Always Encrypted, you must use a supported driver, such as ADO.NET or the ODBC Driver 13 for SQL Server preview, to encrypt plaintext data before storing it in SQL Server 2016 CTP2 and above or Azure SQL Database. Similarly, you use a capable driver such as the new ODBC driver or ADO.NET to decrypt the data.
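
As a rough sketch, opting into the feature from an ODBC connection string might look like the fragment below; the server and database names are hypothetical, and the exact keyword name and casing should be confirmed against the driver documentation for the version you install:

```
Driver={ODBC Driver 13 for SQL Server};
Server=myserver.example.com;
Database=MyDb;
Trusted_Connection=yes;
ColumnEncryption=Enabled;
```

With column encryption enabled, the driver fetches the column master key metadata and transparently encrypts parameters and decrypts result columns for the Always Encrypted columns it touches.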

Internationalized Domain Names for Windows

Internationalized Domain Names (IDNs) allow your server to use Unicode characters in its name, enabling support for more languages. Using the new Microsoft ODBC Driver 13 for SQL Server on Windows Preview, you can convert a Unicode server name to ASCII-compatible encoding (Punycode) when required during a connection by setting the property serverNameAsACE to true. If your DNS service is configured to allow Unicode characters directly, keep the default serverNameAsACE value of false.
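A minimal connection-string sketch, assuming a Unicode server name (the names and the exact keyword spelling accepted by the preview driver should be checked against the driver documentation):

```ini
Driver={ODBC Driver 13 for SQL Server};Server=unicodeservername;Database=myDatabase;
Trusted_Connection=Yes;ServerNameAsACE=Yes;
```

With the keyword enabled, the driver performs the Punycode conversion before the DNS lookup.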

Linux ODBC drivers add Ubuntu support

The preview ODBC drivers for Linux now support Ubuntu, Red Hat, and SUSE. This is the first ODBC Driver for SQL Server release to support Ubuntu, so you can now enjoy enterprise-level support when connecting to SQL Server from Ubuntu. The drivers have also been updated to support the unixODBC driver manager 2.3.1.

Learn more

The ODBC driver is part of SQL Server and the Microsoft Data Platform’s wider interoperability program, with drivers for PHP 5.6, Node.js, JDBC, and ADO.NET already available. Look for more features in coming releases as we continue to build out support for Linux in our ODBC driver.

We invite you to explore the latest the Microsoft Data Platform has to offer via a trial of Microsoft Azure SQL Database or by trying the new SQL Server 2016 CTP.

For more information see documentation on the Microsoft Developer Network.

Questions? Join the discussion of the new driver capabilities on MSDN and Stack Overflow. If you run into an issue or would like to make a suggestion, let us know via Connect.

Categories: Database

Effortlessly Analyze Data History Using Temporal Tables

Real data sources are never static; critical business information changes over time and important decisions often rely on insights that analysts get from evolving data.

Users who track data history aim to answer fundamental questions: How did the data look at a specific point in time in the past (yesterday, a month ago, a year ago)? What changes have been made, and when? What were the dominant trends over a specific period? Without proper support in the database, however, questions like these have never been easy to answer.

Temporal Tables are a new feature in SQL Server 2016, designed to be the ultimate productivity tool for developing or migrating applications that provide insights from historical data. Temporal Tables allow you to track the full history of changes without additional code and let you focus your data analysis on a specific point in time in a very simple and efficient way.

Getting started with Temporal Tables

Temporal Tables keep data closely related to its time context, so that stored facts can be interpreted as valid only within a specific period. They are also referred to as system-versioned temporal tables, because the period of validity for each row is automatically maintained by the system (i.e. the database engine).

Depending on your scenario, you can either create new system-versioned temporal tables or extend existing tables with temporal attributes.

When the data in a temporal table is modified, version history is built automatically and transparently without additional action from the end user or application. This property makes Temporal Tables an obvious choice when adding data auditing to existing applications because it is not necessary to change how they interact with the database in order to modify or read the latest (actual) state of data.

With Temporal Tables, historical data is just one query away, which gives you very efficient point-in-time analysis without additional latency.

The diagram below depicts a typical user workflow with Temporal Tables:

To get started with Temporal Tables, download the AdventureWorks Database for SQL Server 2016 CTP3 with script samples and follow the instructions in the folder “Temporal."

To learn the different ways to create Temporal Tables and work with them in your applications, check out Getting started with system-versioned Temporal Tables at MSDN.
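As a minimal sketch of creating a new system-versioned temporal table (the table and column names are illustrative), you declare two period columns and turn system-versioning on:

```sql
-- Current table with system-managed period columns; SQL Server
-- maintains the paired history table automatically.
CREATE TABLE dbo.Department
(
    DeptID       int NOT NULL PRIMARY KEY CLUSTERED,
    DeptName     varchar(50) NOT NULL,
    SysStartTime datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime   datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.DepartmentHistory));
```

From this point on, every UPDATE or DELETE against dbo.Department automatically archives the previous row version into dbo.DepartmentHistory.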

Typical scenarios

Temporal Tables can be used in a wide set of scenarios where you need to track data history. We strongly recommend them in the following use cases because of the productivity benefits:

  • Data audit: Turn on system versioning for tables that store critical information where you need to track what changed, when, and by which user. Using Temporal Tables allows you to perform a data audit transparently (without breaking existing applications) and to perform efficient data forensics at any point in time.
  • Time travel: Use Temporal Tables to reconstruct the state of your data at any point in time in the past, compare data from multiple moments side-by-side, and perform trend analysis for important business indicators. Utilize new temporal querying capabilities to review business reports as they were in the past without changing the report. Zoom in for business analysis at a specific point in time naturally using the built-in querying capabilities.
  • Slowly changing dimensions: Keep a history of dimension attribute changes in the data warehouse model without additional ETL code.
  • Repair record-level corruptions: Perform the finest level of data repair in case of accidental corruption caused by a human or an application. The affected portion of data can be restored to its "last known good state" very precisely, without dealing with backups and without introducing downtime to your application.

For detailed descriptions of supported scenarios, refer to Temporal Table usage scenarios on MSDN.

Effectiveness of using Temporal Tables

Temporal Tables are a huge productivity booster for SQL Server developers because they simplify every phase of the data lifecycle, from object creation through schema evolution, data modification, and data analysis to security. The image below summarizes the benefits for users.

  1. Schema maintenance: It is very easy to create new Temporal Tables or extend existing non-temporal tables to become system-versioned. SQL Server 2016 automatically creates an accompanying history table, applying smart defaults to optimize for typical scenarios (change tracking, temporal querying, data cleanup, etc.). Period columns can be added to existing tables as hidden, which enables you to perform data audit and time travel without breaking existing applications. SQL Server 2016 also allows you to evolve the table schema the same way you do with any non-temporal table (using ALTER TABLE from Transact-SQL scripts or SQL Server development tools for visual schema editing).
  2. History tracking is completely transparent to running workloads. While users and applications modify the data in the regular way, history is automatically built behind the scenes.
  3. Data analysis: SQL Server 2016 allows you to perform complex time travel querying very easily, using the FOR SYSTEM_TIME clause and hiding all complexity of matching current and historical data from you. The power of temporal querying becomes apparent in combination with views, especially in scenarios when you need to query complex database models including multiple Temporal Tables with foreign key relationships “as of” any point in time in the past.
  4. Data protection: Temporal Tables ensure immutability of historical data, even for users with edit permissions, which makes Temporal Tables a very good candidate for data audit scenarios. Even system administrators must make a conscious choice to remove system-versioning before making any change to historical data. However, all changes to Temporal Table configuration, including the state of system-versioning, can be easily tracked using SQL Server Audit. For more details, check out Temporal Table Security on MSDN.
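The schema-maintenance point above, extending an existing table, can be sketched as follows (table, column, and constraint names are illustrative; the defaults populate the new period columns for existing rows):

```sql
-- Add hidden period columns to an existing table, then enable
-- system-versioning. Existing applications keep working unchanged.
ALTER TABLE dbo.Department ADD
    SysStartTime datetime2 GENERATED ALWAYS AS ROW START HIDDEN NOT NULL
        CONSTRAINT DF_Department_SysStart DEFAULT SYSUTCDATETIME(),
    SysEndTime   datetime2 GENERATED ALWAYS AS ROW END HIDDEN NOT NULL
        CONSTRAINT DF_Department_SysEnd
            DEFAULT CONVERT(datetime2, '9999-12-31 23:59:59.9999999'),
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime);

ALTER TABLE dbo.Department
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.DepartmentHistory));
```

Because the period columns are HIDDEN, SELECT * queries and INSERT statements that do not name them continue to behave exactly as before.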

How Temporal Tables work

A Temporal Table is a configuration of two user tables: the current table, keeping the actual (latest) row versions, and the history table, storing previous versions of every row that has ever been changed or deleted. Any operation that inserts new rows affects only the current table; the database automatically records the period start time based on the begin time of the transaction. The additional overhead of an insert operation is negligible compared to inserts into non-temporal tables.

During update or delete operations, previous row versions are automatically moved to the history table, with the end of their validity period set to the begin time of the transaction that initiated the change in the current table. This operation is referred to as "system-versioning," and it occurs as part of the same transaction that modifies the current table.

System-versioning adds overhead to update/delete operations compared to the non-temporal case because it actually performs two operations. However, overhead is less than in any custom solution users build for temporal data handling (triggers, stored procedures, and application logic).

Querying current data does not differ from the non-temporal case and does not introduce any performance overhead either. There are several modes of querying historical data with the FOR SYSTEM_TIME clause. During the processing of a query with FOR SYSTEM_TIME, SQL Server transparently adds a union between the current and history tables and propagates temporal predicates to filter data based on its period of validity.
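Two of those modes can be sketched as follows, assuming a system-versioned table named dbo.Department (the table name and dates are illustrative):

```sql
-- Point-in-time query: the table as it looked at a specific moment.
SELECT DeptID, DeptName
FROM dbo.Department
FOR SYSTEM_TIME AS OF '2015-09-01T00:00:00';

-- Range query: all row versions that were active during a period,
-- including their validity intervals.
SELECT DeptID, DeptName, SysStartTime, SysEndTime
FROM dbo.Department
FOR SYSTEM_TIME BETWEEN '2015-01-01' AND '2015-12-31';
```

In both cases SQL Server decides internally which rows come from the current table and which from the history table; the query text never mentions the history table.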

The size of the history table depends on the application DML pattern (insert vs. update/delete) and period of time during which system-versioning was active, but in general, Temporal Tables tend to increase database size more than regular tables. Therefore, it is strongly recommended that you plan for retention of historical data and perform periodic data clean-ups.

Learn more

Temporal Tables on MSDN

Getting started with Temporal Tables

Temporal Tables on Channel 9

Temporal Table usage scenarios

Manage retention of historical data in system-versioned Temporal Tables

System-versioned Temporal Tables with memory-optimized tables

See the other posts in the SQL Server 2016 blogging series.

Categories: Database

JSON in SQL Server 2016: Part 4 of 4

Exporting Data as JSON - FOR JSON

In this final post of our four-part JSON series, we showcase the ability to format query results as JSON text using the FOR JSON clause. If you are familiar with the FOR XML clause, you will easily understand FOR JSON.

When you add the FOR JSON clause at the end of a SQL SELECT query, SQL Server will take the results, format them as JSON text, and return them to the client. Every row will be formatted as one JSON object, values in cells of the result set will be generated as values of JSON objects, and column names or aliases will be used as key names. We have two kinds of FOR JSON clauses:

  • FOR JSON PATH: Enables you to define the structure of the output JSON using the column names/aliases. If you put dot-separated names in the column aliases, the JSON properties will be nested according to those paths. This is similar to FOR XML PATH, where you can use slash-separated paths.
  • FOR JSON AUTO: Automatically creates nested JSON sub-arrays based on the table hierarchy used in the query. Again, this is similar to FOR XML AUTO.

JSON text that is generated with a FOR JSON clause can be transformed back to the relational form using OPENJSON.
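A minimal sketch of FOR JSON PATH, assuming an Orders table shaped like the array used in Part 3 of this series:

```sql
-- Dot-separated aliases nest Number and Date under an "Order" object.
SELECT Number   AS 'Order.Number',
       Date     AS 'Order.Date',
       Customer,
       Quantity
FROM Orders
FOR JSON PATH;
```

Each row is formatted as an object along the lines of {"Order":{"Number":1,"Date":"..."},"Customer":"Adventure works","Quantity":1200}, and the whole result set is returned as one JSON array.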

Conclusion

JSON functions in SQL Server enable you to query and analyze JSON data as well as transform JSON to relational domains, and relational data to JSON. They allow you to integrate SQL Server with external systems that produce or consume JSON data without additional transformations in the application layer.

SQL Server also provides a hybrid storage model where you can combine relational data and JSON. This model enables you to make trade-offs between high performance data access and flexibility/rapid developments. You can use the same indexing techniques both on standard columns and values in JSON text.

The hybrid model retains all the benefits of the SQL Server engine with fully powerful query language and ACID transactions. It also boasts well-known management and security models, several features that work with JSON functions, and a wide ecosystem of tools compatible with SQL Server.

Check out the other posts in this four-part series in the links below, or learn more in the SQL Server 2016 blogging series.

JSON in SQL Server 2016: Part 1 of 4

JSON in SQL Server 2016: Part 2 of 4

JSON in SQL Server 2016: Part 3 of 4

Categories: Database

JSON in SQL Server 2016: Part 3 of 4

Transform JSON Text to Relational Table - OPENJSON

OPENJSON is a table-value function (TVF) that looks into JSON text, locates an array of JSON objects, iterates through the elements of the array, and for each element returns one row in the output result.

When calling OPENJSON, we can specify where to locate the JSON array that should be opened (e.g. via the $.Orders path), which columns should be returned in the result, and where in the JSON objects to find the values that will populate those cells.

OPENJSON can be used in any query that works with data. As an example, we can transform a JSON array in the @orders variable into a set of rows and insert them into a standard table:

INSERT INTO Orders(Number, Date, Customer, Quantity)
SELECT Number, Date, Customer, Quantity
FROM OPENJSON (@orders)
 WITH (
        Number varchar(200),
        Date datetime,
        Customer varchar(200),
        Quantity int
 ) AS OrdersArray

Four columns in the result set that is returned by OPENJSON are defined in the WITH clause. OPENJSON will try to find the properties Number, Date, Customer, and Quantity in each JSON object and convert their values into columns in the result set. By default, NULL will be returned if the property is not found. The assumption in the query above is that the @orders variable contains the following JSON array:

'[
   {"Number":1, "Date": "8/10/2012", "Customer": "Adventure works", "Quantity": 1200},
   {"Number":4, "Date": "5/11/2012", "Customer": "Adventure works", "Quantity": 100},
   {"Number":6, "Date": "1/3/2012", "Customer": "Adventure works", "Quantity": 250},
   {"Number":8, "Date": "12/7/2012", "Customer": "Adventure works", "Quantity": 2200}
]'

As you can see, the transformation from a JSON text to a relational form is simple. You just need to specify column names and types and OPENJSON will find properties in JSON that match these columns. In this example, plain JSON is used; however, OPENJSON can handle any nested/hierarchical structure of JSON objects.

Also, OPENJSON can be used to combine relational and JSON data in the same query. If we assume that the JSON array shown in the previous example is stored in the OrdersJson column, the following query can combine the columns and JSON fields:

SELECT Id, FirstName, LastName, Number, Date, Customer, Quantity
FROM Person
    CROSS APPLY OPENJSON (OrdersJson)
        WITH (
            Number varchar(200),
            Date datetime,
            Customer varchar(200),
            Quantity int
        ) AS OrdersArray

OPENJSON will open an array in each cell and return one row for each JSON object (i.e. element) in the array. CROSS APPLY OPENJSON syntax is used to “join” rows in the table with the child inner table that will be materialized from a JSON array in the JSON cell.

Indexing JSON data

Although values in JSON are stored as text, you can index them the same way as any other values in table columns, using either standard nonclustered indexes or full-text search indexes.

If you want to index a JSON property that is frequently used in queries, you can create a non-persisted computed column that references the value, and then create a standard index on that column. In the following example, we optimize queries that filter rows using the $.Company property in the InfoJSON column:

ALTER TABLE Person
ADD vCompany AS JSON_VALUE(InfoJSON, '$.Company')

CREATE INDEX idx_Person_1
    ON Person(vCompany)
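Queries that filter on the same JSON path can then benefit from the index, because SQL Server matches the expression to the computed column (the company value here is illustrative):

```sql
SELECT PersonID, FirstName, LastName
FROM Person
WHERE JSON_VALUE(InfoJSON, '$.Company') = 'Microsoft'
```

The filter can be written either against vCompany or against JSON_VALUE(InfoJSON, '$.Company'); both forms can use idx_Person_1.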

As you can see, SQL Server provides a hybrid model where you can put values from JSON in either key or included columns and use both JSON values and standard columns in the same index.

Since JSON is regular text, you can also use standard full-text indexes, which can be created on arrays of values. You can create a full-text index directly on a column that contains a JSON array, or you can create a computed column that references an array in an existing column and create a full-text index on that computed column:

ALTER TABLE Person
ADD vEmailAddresses AS JSON_QUERY(InfoJSON, '$.Contact.Emails')

CREATE FULLTEXT INDEX ON Person(vEmailAddresses)
    KEY INDEX PK_Person_ID ON jsonFullTextCatalog;

Full text index is useful if you need to optimize queries that try to find rows where the JSON array contains some value:

SELECT PersonID, FirstName,LastName,vEmailAddresses
FROM Person
WHERE CONTAINS(vEmailAddresses, 'john@mail.microsoft.com')

This query returns the Person rows where the email array contains the value 'john@mail.microsoft.com'. The full-text index doesn't have any special rules for parsing JSON: it splits a JSON array using separators (double quotes, commas, brackets) and indexes the values in the array. Full-text indexing is applicable to arrays of numbers and simple string values; if you have more complex objects in a JSON array, a full-text index cannot be applied directly because the system cannot distinguish keys from values.

As you can see, the same indexing methods are used both on JSON values and relational columns.

Check out the other posts in this four-part series in the links below (as they become available), or learn more in the SQL Server 2016 blogging series.

JSON in SQL Server 2016: Part 1 of 4

JSON in SQL Server 2016: Part 2 of 4

JSON in SQL Server 2016: Part 4 of 4

Categories: Database

Actian and Attivio OEM Agreement Accelerates Big Data Business Value by Integrating Big Content

Actian Corporation (Ingres) Press Releases - Wed, 11/13/2013 - 14:03
Attivio AIE® integration into the Actian ParAccel Big Data Analytics Platform delivers unmatched performance and insights at a fraction of big stack vendor cost

Newton, MA and Redwood City, CA – November 13, 2013 – Attivio, creator of the award-winning Active Intelligence Engine (AIE®), and Actian, a leader in next-generation big data analytics, today announced a strategic OEM agreement. The agreement gives Actian customers the power to integrate and correlate human-generated, unstructured content sources with their structured data analytics to dramatically increase the return on their information assets.

Actian’s ParAccel Big Data Analytics Platform accelerates analytic performance and enables rapid deployment of highly targeted analytics for critical initiatives including customer segmentation, fraud prevention, risk avoidance and digital marketing optimization. The addition of Attivio AIE’s ability to include Big Content sources – social media, blog posts, email, research documents, SEC filings, insurance reports, legal depositions, open-ended survey responses and more – completes the Big Data analytics picture to greatly improve its impact and value.

With its patented query-time JOIN technology, Attivio AIE delivers unrivaled access to information from internal and external human-created information, so customers can integrate, correlate, access and analyze all of their assets, regardless of format. As a result, business users gain new, real-time business insights that help build revenue, cut costs, increase competitiveness and manage risk — insights that might otherwise go undiscovered.

“Big Content has become a vital piece in the Big Data puzzle,” said David Schubmehl, Research Director, IDC. “The majority of enterprise information created today is human-generated, but legacy systems have traditionally required processing structured data and unstructured content separately. The addition of Attivio AIE to Actian ParAccel provides an extremely cost-effective option that delivers impressive performance and value.”

“Through this alliance with Attivio, our customers now have access to all internal and external sources of unstructured content, solidifying our leadership position in the Big Data space,” said Fred Gallagher, VP Strategic Business for Actian. “With Attivio, we are giving customers more options and making this technology accessible to companies who recognize the value of innovation that is simply not available from legacy big stack vendors who typically impose expensive, long-term contracts, inflexible technology and long deployment times.”

“Actian ParAccel delivers high-performance analytics on massive amounts of data,” said Sid Probstein, CTO at Attivio. “Being able to access and analyze outside sources from social media to any and all types of human-created content provides a complete and detailed view of customers and their requirements. It advances the entire field of BI and analytics.”

For more information about the Actian alliance and the Attivio Partner Network, please contact partners@attivio.com.

About Attivio

Attivio’s unified information access platform, the Active Intelligence Engine® (AIE®), redefines the business impact of our customers’ information assets, so they can quickly seize opportunities, solve critical challenges and fulfill their strategic vision.

Attivio integrates and correlates disparate silos of structured data and unstructured content in ways never before possible. Offering both intuitive search capabilities and the power of SQL, AIE seamlessly integrates with existing BI and big data tools to reveal insight that matters, through the access method that best suits each user’s technical skills and priorities. Please visit us at www.attivio.com.

Attivio and Active Intelligence Engine are trademarks of Attivio, Inc. All other names are trademarks of their respective companies.

About Actian: Take Action on Big Data

Actian powers the action-driven enterprise in the Age of Data, delivering rapid time to analytic value with a modular approach that connects data assets behind the scenes, enables unconstrained analytics and scales almost without limits. The Actian DataCloud Integration and ParAccel Big Data Analytics Platforms help organizations leverage data assets for competitive advantage, delivering highly parallelized software that fully exploits modern chip, core and cluster infrastructures. Its scalable solutions deliver powerful business results for industry innovators. Actian serves tens of thousands of users worldwide and is headquartered in Redwood City, California. Stay connected with Actian Corporation on Facebook, Twitter and LinkedIn.

Actian, DataCloud, ParAccel, Action Apps, Ingres, RushAnalytics, Versant and Vectorwise are trademarks of Actian Corporation and its subsidiaries. All other trademarks, trade names, service marks, and logos referenced herein belong to their respective companies.

Categories: Database, Vendor

Actian to Present at the NASSCOM Product Conclave 2013

Actian Corporation (Ingres) Press Releases - Mon, 10/28/2013 - 19:00

Redwood City, Calif. – October 28, 2013 – Steve Shine, CEO and President of Actian Corporation (“Actian”), a next-generation leader in big data analytics, will present during the NASSCOM® Product Conclave 2013 – The Community Meet-up of Doers Who Share! event, held Oct. 28-30 at Vivanta by Taj – Yeshwantpur in Bangalore, India.

During the Opportunities in Big Data session, which takes place Oct. 30 at 2:00-2:30 p.m. IST in the Aura room, Shine will discuss the various opportunities presented by big data, as well as new areas of innovation for startups trying to effect meaningful competitive differentiation through analytics on big data.

Actian has a strong presence in India, helping companies like Future Group, Mahindra Comviva and ManageEngine, a division of Zoho Corporation, harness the power of big data analytics to predict business trends, increase customer retention and gain a competitive advantage.

For more information on the NASSCOM Product Conclave 2013, please visit http://productconclave.nasscom.in/.

About Actian: Take Action on Big Data

Actian powers the action-driven enterprise in the Age of Data, delivering rapid time to analytic value with a modular approach that connects data assets behind the scenes, enables unconstrained analytics and scales almost without limits. The Actian DataCloud Integration and ParAccel Big Data Analytics Platforms help organizations leverage data assets for competitive advantage, delivering highly parallelized software that fully exploits modern chip, core and cluster infrastructures. Its scalable solutions deliver powerful business results for industry innovators. Actian serves tens of thousands of users worldwide and is headquartered in Redwood City, California. Stay connected with Actian Corporation on Facebook, Twitter and LinkedIn.

Actian, DataCloud, ParAccel, Action Apps, Ingres, RushAnalytics, Versant and Vectorwise are trademarks of Actian Corporation and its subsidiaries. All other trademarks, trade names, service marks, and logos referenced herein belong to their respective companies.

Categories: Database, Vendor

Actian Teams with Hortonworks to Deliver Reference Architecture that Addresses Most Vexing Hadoop Challenges for Enterprises

Actian Corporation (Ingres) Press Releases - Fri, 10/25/2013 - 03:51

Sand Hill research pinpoints Hadoop adoption hurdles at each phase of maturity cycle

Redwood City, Calif. – October 24, 2013 – Actian Corporation (“Actian”), a next-generation leader in big data analytics, has further integrated its ParAccel Big Data Analytics Platform on the Hortonworks Data Platform to address the top Hadoop adoption hurdles of data extraction challenges, design-time friction and poor run-time performance.

The deep integration of Actian’s ParAccel Big Data Analytics Platform on the Hortonworks Data Platform enables high-performance analytics without constraints, simplifies the delivery of data services for entire ecosystems of users, and significantly lowers the total cost of ownership for the modern data platform. By utilizing the new Hortonworks/Actian reference architecture, users can experience quicker time to analytic value by:

  • Easily moving data to and from Hadoop: Users can easily extract data from Hadoop and enrich it seamlessly from within the ParAccel Platform.
  • Eliminating design-time friction: ParAccel Dataflow’s easy-to-use drag-and-drop development capabilities negate the need for complex MapReduce coding or parallel programming.
  • Improving performance: Actian is Hortonworks Data Platform 2.0 and YARN certified, giving users the ability to leverage ParAccel Dataflow for the enhanced performance of pipeline parallelism alongside their existing MapReduce jobs.

Sand Hill Group recently surveyed Hadoop users, primarily data architects and CIOs, to assess their level of maturity in Hadoop adoption and the top challenges they face. For 46.7 percent of respondents, the top challenge was knowledge and experience with the Hadoop platform, while 20.7 percent cited availability of Hadoop and Big Data skills and 6.7 percent struggled with the amount of technology development and engineering required to implement a Hadoop-based solution. Actian’s and Hortonworks’ joint reference architecture tackles these key challenges head-on.

“Hadoop is the clear winner for enterprises that seek to drive transformational, actionable business insights from their big data,” said Mike Hoskins, chief technology officer of Actian Corporation. “For all its power, Hadoop still presents significant barriers to enterprise-scale adoption. The combined Actian-Hortonworks solution overcomes challenges around MapReduce skills shortages, design-time friction and run-time performance to elevate Hadoop to true enterprise readiness. Actian has many global clients delivering groundbreaking big data projects. For nearly all of these, Hadoop is a central building block. The collaboration between Hortonworks and Actian provides a huge boost for performance, productivity and access to rich big data functionality, which helps increase adoption of Hadoop. For our mutual customers it boils down to faster time to value – in the end, that’s what really counts.”

“We have collaborated closely with Actian on the reference architecture for the ParAccel Big Data Analytics Platform and the Hortonworks Data Platform,” said Ari Zilka, chief technology officer of Hortonworks. “When used together, these products allow users to rapidly and effectively deploy high-performance Hadoop applications for scenarios ranging from archiving, ETL and operational applications to advanced analytics, including predictive analytics. By working together we are excited to help forward-thinking architects and CIOs successfully design and implement incredibly high-performing, hard-hitting big data applications with proven best practices, robust and secure architecture, and unmatched technology capabilities.”

“Design and implementation of next-generation big data integration processes can be a daunting leap for BI professionals who are used to classic ETL technology,” said Steve Miller, president and co-founder of OpenBI. “Actian’s approach – providing a visual data programming technology that executes natively in cluster – will increase productivity while opening up big data analytics programming to a much broader set of professionals.”

“Our research findings offer compelling insights into friction points on the march to harnessing Hadoop to unlock big data analytics value for the enterprise,” said M.R. Rangaswami, co-founder and managing director of Sand Hill Group. "With more than 35 percent of respondents indicating that the results of their Hadoop/Big Data initiatives were less than they expected, the market is ripe for more easy-to-use, easy-to-implement solutions on Hadoop such as Actian's analytics and ETL offerings.”

Hoskins and Zilka will jointly present at Strata + Hadoop World 2013 on Tuesday, October 29, 2013 at 1:45 p.m. EDT. In addition, both Actian and Hortonworks will provide hands-on demonstrations of the reference architecture at their Strata booths, #108 and #306, respectively.

About Actian: Take Action on Big Data
Actian powers the action-driven enterprise in the Age of Data, delivering rapid time to analytic value with a modular approach that connects data assets behind the scenes, enables unconstrained analytics and scales almost without limits. The Actian DataCloud integration and ParAccel Big Data Analytics Platforms help organizations leverage data assets for competitive advantage, delivering highly parallelized software that fully exploits modern chip, core and cluster infrastructures. Actian serves tens of thousands of users worldwide and is headquartered in Redwood City, Calif. with offices in Austin, New York, London, Paris, Frankfurt, Amsterdam, New Delhi and Melbourne. Stay connected with Actian Corporation on Facebook, Twitter and LinkedIn.

Actian, Actian DataCloud, ParAccel, Action Apps, Ingres, Versant and Vectorwise are trademarks of Actian Corporation and its subsidiaries. All other trademarks, trade names, service marks, and logos referenced herein belong to their respective companies.

# # #

Categories: Database, Vendor

Actian Appoints New CMO and SVP of Business Development to Accelerate the Transformation of the Big Data Marketplace

Actian Corporation (Ingres) Press Releases - Thu, 10/24/2013 - 19:00

Ashish Gupta, Former Microsoft and Vidyo executive, is poised to redefine the data analytics marketplace by creating new opportunities not addressed by aging stack players

Redwood City, Calif. – October 24, 2013 – Actian Corporation (“Actian”), a next-generation leader in big data analytics and cloud integration, has appointed Ashish Gupta as chief marketing officer and senior vice president of business development.

“We have very systematically evolved our business so that we have a very powerful engine for the market’s transition to the Age of Data,” said Steve Shine, chief executive officer of Actian Corporation. “Ashish Gupta brings the go-to-market experience to ensure that we build our base in big data analytics and cloud integration. More importantly, he has the strategic vision to work with the team to entirely redefine these markets as we leapfrog legacy stack players.”

Gupta brings a passion for, and a track record of, challenging established companies saddled with legacy approaches and caught in the Innovator’s Dilemma, exploiting disruptive technologies and creative marketing strategies to rapidly scale a new entrant’s business and open markets not addressed by incumbents. He joins Actian from Vidyo, a video conferencing provider, where he served as chief marketing officer and senior vice president of corporate development. Under his leadership, Vidyo experienced significant growth, formed key strategic partnerships and built a leading brand – challenging the leaders of a 30-year-old industry. Prior to Vidyo, Gupta held various executive roles in the Microsoft Office Division, where he led marketing, sales and business development teams for Microsoft’s Unified Communications Group, competing with established PBX vendors to create a leadership position for Microsoft Lync in the communications and collaboration market. Gupta has also served in executive roles at Alcatel/Genesys Telecommunications, Deloitte Consulting, Covad and Hewlett-Packard.

“Severe market disruption happens when innovative technologies materially shift an industry’s fulcrum to create new business approaches and markets,” said Gupta. “Actian’s portfolio has proven to be such a force in the industry by enabling next-generation analytical solutions and invisible integration for thousands of customers that use data to create sustainable competitive advantage. I welcome the opportunity to contribute to the company’s fast growth and market leadership.”

About Actian: Take Action on Big Data

Actian powers the action-driven enterprise in the Age of Data, delivering rapid time to analytic value with a modular approach that connects data assets behind the scenes, enables unconstrained analytics and scales almost without limits. The Actian DataCloud Integration and ParAccel Big Data Analytics Platforms help organizations leverage data assets for competitive advantage, delivering highly parallelized software that fully exploits modern chip, core and cluster infrastructures. Its scalable solutions deliver powerful business results for industry innovators. Actian serves tens of thousands of users worldwide and is headquartered in Redwood City, California. Stay connected with Actian Corporation on Facebook, Twitter and LinkedIn.

Actian, DataCloud, ParAccel, Action Apps, Ingres, RushAnalytics, Versant and Vectorwise are trademarks of Actian Corporation and its subsidiaries. All other trademarks, trade names, service marks, and logos referenced herein belong to their respective companies.

Categories: Database, Vendor

Actian to Present at the 10th Annual East Coast Technology Growth Conference

Actian Corporation (Ingres) Press Releases - Tue, 10/15/2013 - 02:41

Redwood City, Calif. – October 14, 2013 – John Santaferraro, vice president of solutions and product marketing at Actian Corporation (“Actian”), a next-generation leader in big data analytics, will present during the AGC Partners 10th Annual East Coast Technology Growth Conference at the Westin Copley Place in Boston on Tuesday, October 15, 2013.

During the event, Santaferraro will participate in a panel focused on Big Data Analytics moderated by Jeff Kelly, Principal Research Coordinator of the Wikibon Project. Santaferraro and fellow panelists Kathy McGroddy, VP of Business Development of IBM; Louis Brun, Senior VP of Product Strategy of Guavus; Jit Saxena, Chairman of Netezza; and Anthony Deighton, CTO of QlikView, will discuss why traditional data management and business analytics tools and technologies are straining under the weight of big data. They will shed light on challenges organizations face when deriving value from their data and share new approaches in helping enterprises gain actionable insights through next-generation big data analytics.

Visit actian.com to learn more about Actian’s next-generation approach of enabling organizations to predict the future, prescribe the next best action, prevent damage to their business and discover hidden risks and opportunities through use of the ParAccel Big Data Analytics Platform.


Categories: Database, Vendor

Actian Integrates ParAccel Big Data Analytics Platform to Speed Time to Analytic Value

Actian Corporation (Ingres) Press Releases - Fri, 10/11/2013 - 21:14

Platform combines ParAccel Big Data Analytics Platform’s high-performance analytical capabilities with Hadoop’s data management capabilities to democratize big data analytics

Redwood City, Calif. – October 11, 2013 – Actian Corporation (“Actian”), a next-generation leader in big data analytics, announced the release of new ParAccel Dataflow Operators that integrate the recently announced ParAccel Big Data Analytics Platform with Hadoop, further unifying its technology offerings. This new innovation ties together the data management capabilities of Hadoop with the high-performance analytic processing of the ParAccel Big Data Analytics Platform, enabling organizations to achieve better price performance and quicker time to analytic value.

The ParAccel Big Data Analytics Platform provides a single, visual interface to load, prepare, enrich and analyze data natively on various Apache Hadoop distributions. Users can now process data directly on Hadoop and move the data to the ParAccel Big Data Analytics Platform’s high-performance analytics engine without any MapReduce or SQL programming, rapidly increasing the rate at which business value is derived from the analytics process. Business users and analysts alike can leverage the extreme performance of the ParAccel Big Data Analytics Platform for their predictive analytics initiatives.

“In the Age of Data, organizations need next-generation technologies that rapidly turn big-data investments into value-creation engines. To achieve this, different data platforms must work together to meet the mixed workload demands that result from the convergence of emerging data and new analytics,” said Steve Shine, chief executive officer of Actian Corporation. “Actian has quickly integrated its leading-edge technology assets to create a cooperative analytic processing environment that accelerates the speed at which data flows from data management to high-value analytics.”

The ParAccel Dataflow Operators enable data to persist within Actian’s massively parallel analytic processing engine or the single-node version used for accelerated business intelligence or analytic applications. When used in conjunction with Hadoop’s superior data management capabilities, the platform gives business users the ability to conduct analytics leveraging best-of-breed point applications for enterprise-class deployments.

“Platforms that allow workloads to gravitate to the engine best-suited to handle specific kinds of processing will consistently lower the cost of computing,” said Shawn Rogers, vice president of Enterprise Management Associates. “Hadoop is optimal for preparing and enriching data, as well as search and discovery analytics. Analytic platforms are best-suited for running high-performance, lower latency analytics at any scale. Actian’s next-generation, shared processing approach is consistent with the way companies should architect systems to leverage both technologies.”

This technology integration comes only months removed from Actian’s announcement of leveraging recently acquired companies ParAccel, Pervasive Software and Versant Corporation to deliver the ParAccel Big Data Analytics and Actian DataCloud Platforms. “You are seeing the new Actian in action,” said Shine. “We are a big data innovator with the agility of a small Silicon Valley startup and the strength and staying power of a global stalwart.”


Categories: Database, Vendor

A Great Day for Membase

NorthScale Blog - Tue, 10/05/2010 - 01:14

Whenever I talk about Membase with candidates, employees, or friends, I feel more and more excited about what we are building and how it is going to impact the industry. Each discussion validates my belief that what we do *is* unique and a game changer.

Just today, we had two important “wins.” The first came from a prospect who evaluated our technology against other NoSQL databases and chose Membase – I can’t say much about it yet, but it is an amazing win. The second is that IDC chose us as an innovative company to watch. Great day!

Every morning when I look at my calendar, I find myself looking forward to several things. At the top of the list are the meetings in which I am going to discuss Membase technology, meet smart people, and demo a data management solution they can get excited about. I also look forward to the end of each day, when I get to see what improvements in the latest build make it even better for Membase users. It’s fun being in a position where people are hungry to learn about what you do and how you do it.

After five years in a big company, I now remember how much I love being in a startup: being able to move quickly, change direction fast when needed, develop features in days that in other environments would take weeks or even months, wear multiple hats, and most importantly, stay close to the customer. I like building meaningful systems that solve real problems. This company is an amazing place to be, and it is getting better every day.

And, by the way, we are *always* looking for great people to join us. If you’re one of them, just shoot me an email.

Categories: Architecture, Database

Membase Recognized by IDC

NorthScale Blog - Mon, 10/04/2010 - 17:52

Winning awards is always fun. Over the years, companies I’ve been part of have won their fair share. But not all awards are created equal. Some definitely carry more weight than others, and I put the IDC Innovative Company to Watch award in this category. IDC does extensive research on the markets it covers, talks regularly to a broad set of vendors and customers, and follows a rigorous award-selection process – all of which lends great credibility to the award. It certainly has great meaning for us, and I suspect this is also true for organizations thinking through what database to use for their next project.

As a small company it’s always a challenge to get the word out about your products and this is particularly true when you’re in a space like NoSQL where there are lots of competing technologies. Membase wasn’t one of the first NoSQL products in the market, so it’s encouraging that our innovative work and early customer success is being recognized so quickly. We’re very proud that while IDC could have given the award to any of the many NoSQL contenders, they chose to give it to us.

While we are thrilled to be recognized as a company to watch, it is even more gratifying that IDC understands the strategic importance of this new category of databases for enterprise customers and the significant near-term opportunity (tens of millions of dollars) it represents for companies like ours. IDC notes that they are seeing “an ‘intensifying trend’ for application development to move to the Web, creating the need for back-end architectures that demand extreme speed and scale elasticity while maintaining high levels of reliability.” I can second that. We’ve seen a marked increase in interest in and uptake of our software – we just hit a run-rate of 30,000 downloads a month – and judging by this heightened demand in the marketplace, customers with interactive web applications are clearly looking for alternatives to complement their relational database solutions.

We’re also excited about the range of customer interest. Yes, our customers include many Web 2.0-type companies, such as social gaming and ad targeting platforms, but many enterprise customers are now recognizing the need for non-relational solutions on the back end. A recent InformationWeek survey indicated that 44% of enterprise IT staff had not yet heard of NoSQL databases. But that means 56% have heard of NoSQL – and in fact, if our interactions with customers are any indication, many of them already have pilots underway. From financial services to retailers to media companies, we’re seeing a growing number of inquiries and engagements in the enterprise, and we expect those numbers to increase as the value of NoSQL becomes more widely understood in mainstream IT.

I’d love to hear from you, especially if you are involved with web-based application development for an enterprise. What are your plans for exploring this emerging class of data management solutions optimized to support interactive web applications?

Categories: Architecture, Database