Thursday, 30 April 2015 17:02

A week with the Apple Watch, our verdict

Just under a week ago, the much anticipated Apple Watch arrived in our office. If there’s one thing we could all learn from Apple, it’s how to build hype around a new product launch (even the delivery man hung around to see it opened). It was, as ever, not the first device in this category, but it was certainly the most anticipated. So, a week later, how does it compare to the Android Wear devices we already had in the office?

The first difference is something of a given with Apple: it’s a lot prettier, particularly compared to the LG G Watch we had, which was just plain ugly. For something that you wear, that’s important. Secondly, and also something we’ve come to expect from Apple, the general interface for navigating around the watch is an improvement on Android Wear. This is largely due to the addition of the crown and button on the right of the watch, which give simple scroll and select interactions, but also to the layout of the home and other screens. There’s also the built-in heart rate monitor. Apart from the obvious applications for sport and health apps, this also means the watch knows when it’s being worn, so alerts don’t fire at night when your watch is on your bedside table.

Which brings us neatly on to alerts. We’ve discussed many times in this blog that it’s alerts that really differentiate smart watches from other tech. You never miss your watch vibrating on your wrist, yet nobody else is aware of it, so smart watches provide a more reliable and less intrusive alerting mechanism than our smart phones. We’re interested in these devices because DataPA OpenAnalytics allows “citizen developers” to build almost any alert capability into the business process. Combine this with a smart watch and a factory shift manager can be notified immediately, regardless of his location and the surrounding noise levels, if a large order is placed that will significantly raise demand. Or a retail area manager driving between stores can be notified immediately when demand for a particular product rises suddenly across several of the shops he manages. Or your database administrator can be notified immediately when a database is close to filling its allotted disk space.

It’s in this regard, which we believe is critical to the real value of the smart watch, that Android Wear perhaps retains an edge over the Apple Watch. From the developer’s perspective, the Android Wear alerting framework is much more flexible, easily allowing us to display a custom alert that the user can swipe to see the chart, then tap to open in the app on their phone. These things can be achieved on the Apple Watch, but they’re more difficult to engineer and the behaviour on the watch is less predictable.

However, these are small differences and only time will tell which of these two market leaders, if either, dominates this space. It’s early days for this tech, but in our opinion more companies like ourselves will see the possibilities it offers for both businesses and consumers, and a few years from now smart watches could be as ubiquitous as the smart phone.

Building an analytics platform is about more than extracting data from your business application; that data also needs to be transformed into information that is meaningful for the user. This is traditionally part of the process of engineering your data warehouse: deciding what data marts are required and engineering code to transform the raw data from the business application into these more meaningful data objects. It’s an expensive and difficult process, requiring significant engineering skill and knowledge, and it is almost impossible to get right first time.

While this approach does have advantages, and there will always be a place for data warehouses in analytics, they can be cumbersome, expensive and a real barrier to agility. It also ignores the fact that, more often than not, you have already built the logic to transform data, often multiple times, in the application itself. Take one of our biggest partners, Infor. The distribution industry survives on margins, so flexibility in the calculation of pricing is paramount. That’s one of the reasons Infor’s distribution system is so popular: it has a hugely flexible pricing module. This means, however, that calculating prices is complex, requiring the application of a significant amount of business logic. That logic has already been written, of course, in a stable and robust ABL module that is called from reports and screens across the application. Surely, then, our analytics solution should make it simple for us to just call that logic, not force us to re-engineer and maintain it on another platform elsewhere?

We think that to achieve a really successful embedded analytics solution, integration of business logic at the back end is essential. Not only does it improve agility and save time and effort, it also ensures both the business application and the analytics platform show the same figures, calculated as they are by the same code. That’s why we’ve engineered DataPA OpenAnalytics to natively call any ABL business logic at the back end, either by calling ABL functions to add calculated values, or by going further and running ABL code to return the entire dataset. We believe in empowering our partners to unlock the hugely valuable asset they have in their robust ABL logic, allowing them to deliver beautiful, live intelligence when and where their customers need it.
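To make the first of those patterns concrete, here’s a minimal ABL sketch of the kind of wrapper function an analytics query could call to add a calculated value. The names (pricing.p, calcPrice, getNetPrice) are hypothetical stand-ins for an application’s existing pricing module, not actual Infor or DataPA code:

/* Load the application's existing pricing module persistently,
   so its tested internal procedures can be reused as-is. */
DEFINE VARIABLE hPricing AS HANDLE NO-UNDO.
RUN pricing.p PERSISTENT SET hPricing.

/* A wrapper function the analytics query can call per row
   to add a calculated net price column. */
FUNCTION getNetPrice RETURNS DECIMAL
    (ipcCustomer AS CHARACTER,
     ipcItem     AS CHARACTER,
     ipdQty      AS DECIMAL):

    DEFINE VARIABLE dPrice AS DECIMAL NO-UNDO.

    /* Delegate to the existing business logic rather than
       re-implementing the discount rules in the BI layer. */
    RUN calcPrice IN hPricing
        (INPUT ipcCustomer, INPUT ipcItem, INPUT ipdQty,
         OUTPUT dPrice).

    RETURN dPrice.
END FUNCTION.

The point is that the discount rules live in one place: if the pricing module changes, the analytics figures follow automatically.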

If you would like to find out more about our partner program, please get in touch. 

A couple of weeks ago, our CEO Gary had a new heating system fitted, with the new tado° smart thermostat. It’s pretty impressive, carefully reducing the temperature to an optimal level as soon as the last person has left home and calculating how to pre-warm the house most efficiently for your return. It also monitors the weather, and claims to cut overall heating costs by 31%. Even more interestingly, tado° announced an open API earlier this year. This got us thinking.

Our developers are not known for their ability to focus on the mundane when a new bit of kit is available in the office, so, true to form, a couple of them dropped what they were doing and started hunting the web for details on how to access the API. A little fiddling about at the back end and they had it hooked up to DataPA OpenAnalytics. Now we can all keep an eye on the temperature of Gary’s house online and via the DataPA OpenAnalytics mobile app (take a look).

This is all very entertaining, but it does illustrate a more useful point. The combination of IoT devices and analytics is set to change our world in many unexpected ways. An analytics tool like DataPA OpenAnalytics will quite happily accumulate data from these devices, display it in any imaginable format on any device, and raise alerts should any interesting threshold be met. Gary can keep track of his heating system, and receive an alert should the temperature reach a value that indicates an issue. With a few more IoT devices in the home, he could combine that with information on his fuel consumption and the cost of heating fuel. Surely this will offer opportunities to further improve the efficiency of heating his home; perhaps in the months to come we’ll even see companies offering a monitoring and tuning service.
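For the curious, here’s a minimal sketch of the sort of plumbing involved, using the OpenEdge HTTP client to poll a temperature reading and test it against a threshold. The URL, JSON property name and threshold are all hypothetical, not the real tado° API:

USING OpenEdge.Net.HTTP.ClientBuilder.
USING OpenEdge.Net.HTTP.IHttpClient.
USING OpenEdge.Net.HTTP.IHttpResponse.
USING OpenEdge.Net.HTTP.RequestBuilder.
USING Progress.Json.ObjectModel.JsonObject.

DEFINE VARIABLE oClient   AS IHttpClient   NO-UNDO.
DEFINE VARIABLE oResponse AS IHttpResponse NO-UNDO.
DEFINE VARIABLE oReading  AS JsonObject    NO-UNDO.
DEFINE VARIABLE dTemp     AS DECIMAL       NO-UNDO.

/* Poll the thermostat's web API (hypothetical URL). */
oClient   = ClientBuilder:Build():Client.
oResponse = oClient:Execute(
    RequestBuilder:Get("https://api.example.com/home/temperature"):Request).

IF oResponse:StatusCode = 200 THEN DO:
    /* Assumes a JSON body with a numeric insideTemperature property. */
    oReading = CAST(oResponse:Entity, JsonObject).
    dTemp    = oReading:GetDecimal("insideTemperature").

    /* In DataPA this threshold test would be configured as an alert;
       here we just surface it directly. */
    IF dTemp >= 28 THEN
        MESSAGE "Heating alert:" dTemp "degrees" VIEW-AS ALERT-BOX.
END.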

It’s clear these devices and modern analytics tools are opening up a world of opportunity for innovation. Our errant developers have now been tasked with building an interface to make it simple for anyone to hook DataPA OpenAnalytics up to these devices and services without having to tinker with the back end. We can’t wait to see where this leads us.

If this is the first you’ve heard about DataPA OpenAnalytics, why not find out more at datapa.com.

Our Enterprise server offers the perfect platform to pull data from an OpenEdge app and distribute it as actionable intelligence to any device across the web. Many of our customers already use it, hosting a server themselves. However, for smaller organisations, the cost of hosting and maintaining their own server makes it a much less compelling offering. Which is a shame, because often these customers would benefit the most from our technology. So, for some time now, we’ve been pondering the best way to offer DataPA OpenAnalytics as a hosted service. The hard part was how to connect the largely on-premise OpenEdge databases securely and efficiently across the web.

The AppServer Internet Adapter (AIA) was the obvious answer, but it has always been time consuming and difficult to implement, which kind of defeats the object of offering a hosted solution.

So it was, a little over a year ago, that we became pretty excited when we first heard of the Pacific AppServer. Built on Apache Tomcat, it was billed as a self-contained, stand-alone AppServer module that could be easily deployed to any platform supported by OpenEdge. This sounded like the perfect solution. Roll forward a few months, and DataPA OpenAnalytics now includes native support for the Pacific AppServer, and we’ve spent a few weeks testing it out. Taking the simple and secure connectivity across the web as a given, the most exciting benefit from our perspective is the performance: we saw queries take, on average, less than half the time they took via the AIA.
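For ABL developers, the appeal is how little plumbing a connection now needs. Here’s a minimal sketch of a client connecting over HTTP; the host, port and procedure name are illustrative rather than taken from a real deployment:

DEFINE VARIABLE hServer AS HANDLE NO-UNDO.

CREATE SERVER hServer.

/* A single HTTP(S) URL replaces the web server, AIA servlet and
   NameServer plumbing a classic AppServer connection needed. */
IF hServer:CONNECT("-URL http://pas.example.com:8810/apsv") THEN DO:
    /* Run any server-side procedure as usual (name is illustrative). */
    RUN getSalesFigures.p ON SERVER hServer.
    hServer:DISCONNECT().
END.

DELETE OBJECT hServer.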

So we have to give a big thumbs up to Progress. In our opinion, the Pacific AppServer could be the biggest game changer the OpenEdge platform has seen for some time. For us, it means that in the coming months we can offer companies with an on-premise OpenEdge app a way of securely distributing their information to any device, anywhere, with very little effort or cost. We’re looking forward to watching what others make of it.

Wednesday, 17 June 2015 16:56

The truth about Analytics and Agility

Gone are the days when analytics solutions were simply a window on our business applications. With the proliferation of mobile technologies and the introduction of alerting, analytics solutions can now offer huge opportunities for change within an organisation, driving business agility and changing the working landscape forever. However, to take part in this revolution, your chosen analytics solution must have two very important attributes.

First and foremost, it must be real time. It’s no good alerting a sales representative to an upselling opportunity ten minutes after the customer has left, which is why your analytics solution has to be able to process analytics directly against your operational data. Any solution that requires data to be moved off platform will always introduce a delay and degrade the solution’s ability to provide true real-time responsiveness.

Secondly, the solution must be truly self-service. Business agility is about adapting to change, and nothing stifles an organisation’s ability to adapt more than the combination of over-prescriptive information technology and over-burdened IT departments. An analytics solution should allow anyone in the organisation to ask new questions and configure new alerts quickly and easily, in a single platform, without having to configure several layers of technology.

At DataPA, we believe analytics should empower employees to adapt their working practice as and when they choose, driving business agility and competitiveness. If you’ve never looked at DataPA OpenAnalytics, why not come by our website and take a look.

Speak to pretty much any application vendor who’s been around for some time and they’ll likely tell you their resources are focused heavily on modernization. That’s understandable, given how rapidly our industry has been changing over the last few years, and the constant barrage of social chatter around cloud, mobility and SaaS.

Yet look at any recent survey of CIO investment priorities and you’ll find modernization of enterprise applications is near the bottom. We think this is because disruptive change within industry is not being driven by change to core business applications (which is often very expensive and presents huge risk), but from new technologies and services integrated with these applications and other data sources.

Analytics, on the other hand, is pretty much consistently the top priority, and for good reason. Innovations within business analytics, such as mobile, alerts and collaboration, together with the emergence of new technologies such as Hadoop, MapReduce and Spark that are driving down the cost of big data analytics, are opening up huge opportunities for disruptive change. As these technologies mature and are applied to more aspects of industry, the pace of this disruption is set to rise dramatically.

So we think that if you’re a successful application vendor focused purely on modernization, and you’re not already addressing analytics, you’re missing a huge opportunity to drive revenue from your software. Here at DataPA we’re dedicated to building partnerships with application vendors. We use our expertise in analytics to build industry-leading technology that can be integrated seamlessly with our partners’ applications. We’d love you to join us.

Workflow is a key component for any legal practice – increasing efficiencies, improving customer service and coping with evolving regulatory requirements – which is why legal case management has always been a key focus area for software developers.

The market leader in legal case management is Lexis® Visualfiles from LexisNexis. The Visualfiles “toolkit” allows organisations to expand the standard solution, adding their own entities and workflows to match any business requirement, automating even the most complex processes. Today, Visualfiles is the most widely used case and matter management system in the UK with more than 25,000 registered users in firms ranging from 5 to well over 1,000 employees.

This “ultimate flexibility” posed a particular challenge when LexisNexis came to provide an embedded analytics solution to their customers. For most business applications, the process of transforming raw data in the database into meaningful information for an analytics solution is the same for every customer. Everyone uses the same system, so the transformation can be understood, designed and implemented once for all customers. With Lexis Visualfiles, however, this is not the case. The unique power of Visualfiles allows each customer to evolve their system, and by definition the underlying data set, to match their specific business needs. Whilst this provides fantastic flexibility to ensure the system evolves as the business develops, it creates a huge challenge for analytics.

However, at DataPA we understand that application developers have already designed and implemented this transformation process, otherwise the business application would be of little use. We believe developers should be able to reuse their valuable business logic assets for analytics, not be forced to reengineer them for another platform. So with DataPA OpenAnalytics, the LexisNexis development engineers were able to reuse the existing OpenEdge ABL code to deliver beautiful, accurate, live analytics embedded seamlessly into their application.
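The same idea can go further, with ABL handing the analytics layer a complete, ready-shaped dataset. Here’s a minimal sketch; the temp-table and the matter table are hypothetical stand-ins for a customer-defined Visualfiles entity, not actual Visualfiles schema:

/* A temp-table describing the information the dashboard needs,
   not the raw schema it comes from. */
DEFINE TEMP-TABLE ttMatterSummary NO-UNDO
    FIELD matterId  AS CHARACTER
    FIELD feeEarner AS CHARACTER
    FIELD daysOpen  AS INTEGER.

PROCEDURE getMatterSummary:
    DEFINE OUTPUT PARAMETER TABLE FOR ttMatterSummary.

    /* "matter" is a hypothetical application table; the existing
       business logic decides what counts as an open matter. */
    FOR EACH matter NO-LOCK WHERE matter.closedDate = ?:
        CREATE ttMatterSummary.
        ASSIGN
            ttMatterSummary.matterId  = matter.matterId
            ttMatterSummary.feeEarner = matter.feeEarner
            ttMatterSummary.daysOpen  = TODAY - matter.openedDate.
    END.
END PROCEDURE.

Because each customer’s own code populates the temp-table, the analytics layer sees a consistent shape however far the underlying system has been customised.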

The result is the best of both worlds – a powerful business solution married to sparkling analytics – so everyone wins.

If you have equally valuable business logic developed in OpenEdge, why not talk to us today to find out how you can leverage this valuable asset to deliver beautiful, live intelligence to mobile and web.

Look up pretty much any survey on IT priorities over the last few years, and analytics is consistently number one. Not only that, the number of decision makers reporting that it is their highest priority is growing year on year.

Why? Well, we think the major reason is that modern analytics offers real disruptive change for organisations, rather than just the iterative efficiencies afforded by traditional BI. This change has been driven by the shift from passive reporting to technology that allows users to actively discover, share and collaborate on insight about their organisation.

In our presentation at PUG Challenge EMEA in Copenhagen next week, we’ll explore these ideas further and show how some of our customers are using our technology to radically change how they do business. Here at DataPA we believe this is just the start of a hugely exciting cycle of innovation in analytics, offering huge potential for us and our partners. We’ll also discuss the innovations we’re introducing to our software in the next few months and beyond. Innovations that we’re convinced will keep us and our partners at the forefront of this revolution. We’d love you to join us.


A recurring point of discussion as I visit our customers is the role of traditional printed reports in business intelligence. Like most BI vendors, we have always offered a traditional report designer as one option for visualizing the intelligence DataPA OpenAnalytics generates. However, for a good number of years we’ve concentrated our development efforts on dashboards, and on ways of delivering them to an increasing array of devices. Our reasons are simple: we believe that pretty much any business function can be better supported with a live, interactive visual display of information than with a static printed document.

So I’m always surprised at how many of our customers, even new customers, still rely heavily on our traditional report designer for their business functions. In the last few months, as I’ve visited and spoken with our customers, I’ve begun to ask why.

What’s clear is that, in almost all cases, the decision to choose a report is based more on habit than on any clear, reasoned argument. For example, a common response is the need to take a report to a meeting for discussion. But surely the same information on a tablet, where it’s possible to explore the data behind the figures with colleagues, would be more useful?

Today, with the proliferation of mobile devices and internet connectivity, there are very few situations where static printed documents are a better solution than visual, interactive dashboards delivered to our desktop or mobile devices. As a rule, I would suggest that if there is a legal reason to share or print a document, a report is appropriate; otherwise, why not consider a dashboard that can deliver live intelligence pretty much anywhere?

For our part, whilst we’ll continue to support our customers who choose reports, we’ll focus our efforts on developing dashboards that deliver live, interactive intelligence wherever and whenever it’s required.

There has been a lot of discussion lately that data lakes will transform analytics, giving us access to a huge volume of data with a variety and velocity rarely seen in the past. For those of you who don’t spend your days trawling analytics or big data blogs, the concept of a data lake is simple.

With a traditional data warehouse, the repository is heavily structured, so all the work to convert the data from its raw structure needs to be done before the data enters the repository. This makes it expensive to add new data sources and limits the analytics solution so that only known, pre-determined questions can be asked.

Object store repositories like Hadoop are designed to store just about any data, in its raw state, with little cost or effort. As a result, it becomes cost effective for organisations to store pretty much everything, on the off chance it might be useful at a later date.

The advantage, from an analytics perspective, is that a data lake gives access to a much vaster and richer source of data, facilitating data mining and data discovery in a way that is just not possible with a data warehouse. The disadvantage is that the lack of structure creates real challenges for performance, for data governance, and for providing the context within which less technical users can be self-sufficient.

These challenges need to be met by those of us who design and build analytics solutions. Here at DataPA, we’ve spent years building a platform that facilitates data governance and context in a live data environment. With our technology and experience, there are few companies better placed to take advantage of this new opportunity. Like most new developments, data lakes will not be a silver bullet that solves all analytics requirements. However, we do think they have a significant part to play in the future of analytics, and we can’t wait to see what opportunities they bring for us and our customers.
