Cisco Cuts Jobs As Profits Fall

News
Cisco will cut up to 6,000 jobs, approximately 8% of its global workforce, following another disappointing quarter for the networking kingpin.
Fourth-quarter revenues of $12.4bn (£7.4bn) were flat year-on-year, and net income of $2.2bn was down by 1%. Full-year revenues of $47.1bn were down by 3% on 2013, while net income of $7.9bn was down by 21.3%.
As a result, Cisco chiefs said they would embark on a limited restructuring programme as they seek to respond effectively to the numerous challenges that have dogged the supplier. Cisco has faced, among other things, intense competition and pressure from rival Huawei, and the erosion of traditional hardware markets.
Taking questions from analysts, CEO John Chambers billed the cuts as a “reallocation of resources”.
“It is an investment in our growth areas that we felt we needed to do quickly. In terms of why now, it’s the uncertainties in the market – you’re seeing a few headwinds and a lot of tailwinds. The pace of change is accelerating and we felt we had to move with tremendous speed on it,” he said.
Chambers said Cisco would reinvest its savings into growth areas such as cloud, software and security, and noted that those were often skills that the firm already had in one area of engineering that could simply be moved elsewhere.
More on Cisco
Microsoft and Cisco work together on cloud and datacentre
The firm expects to incur about $700m in pre-tax charges relating to the redundancies, most of which it will recognise in the first quarter of its fiscal 2015.
In Europe, Cisco grew by just 2% after its business in Russia dried up; however, UK growth of 18% in the commercial business and 19% on the enterprise side helped keep the numbers moving in the right direction. Chambers said that, for Cisco, the UK was a prime example of how the firm’s global transition to selling architectures, solutions and business outcomes would pan out.
This transformation was reflected in other ways, with Cisco’s datacentre business strong and growing at over 30% year-on-year, and its unified compute segment (UCS) reaching a worldwide run-rate of over $3bn, with 36,500 customers.
On software-defined networking (SDN) , Cisco said its application-centric infrastructure portfolio was seeing traction across a number of verticals , including cloud providers, hosted services firms, financial services companies and technology companies.
“You’re going to see us embrace SDN, you’re going to see us implement it for the value that it has. Not only will we lead with this implementation, it will allow us to get higher gross margins on our switching and architecture. And we will do it off of an open standard,” said Chambers.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240226823/Cisco-cuts-jobs-as-profits-fall

Interxion Invests €45m In Marseille Datacentre As Profits Climb

News
Interxion, a European provider of cloud and carrier-neutral colocation services, has bought a datacentre facility in Marseille, France, to provide about 5,700m2 of equipped space and 6MW of power to enterprise customers.
The datacentre facility (MRS 1), purchased from SFR, will be built in phases with the first phase of 500m2 opening in the last quarter of 2014 and the second phase opening in 2015.
Interxion has already invested €20m (£15.87m) in acquiring the real estate and building the first two phases, with another €25m (£19.8m) planned to get the full datacentre ready.
The MRS 1 facility currently serves as a transit and caching node for more than 60 network providers. It has access to the aggregation point of eight undersea cables that terminate in Marseille.
Connectivity between continents
In addition to providing the land, buildings, and datacentre equipment, SFR will provide Interxion with immediate, direct access to the existing community of network providers and cable operators in the area. It will also arrange for the transition of services and transfer most of the space in the facility that it currently uses.
“Interxion’s investment in MRS 1 positions it at the crossroads of connectivity between Europe, Asia, Africa and the Middle East. The strong network hub created by multiple undersea cable landing points connecting to terrestrial cables makes Marseille a highly attractive gateway,” said David Ruberg, Interxion’s chief executive.
The datacentre provider has received strong interest from its connectivity, content delivery network (CDN), social media and cloud customers seeking to serve emerging markets, according to Ruberg.
Datacentre growth
According to Frost & Sullivan, the European datacentre services market will see 16% growth up to 2018, thanks to growth in enterprise cloud computing, content-heavy applications and machine-to-machine (M2M) connectivity – which forms the basis for the Internet of Things (IoT).
The colocation provider has a total of 37 datacentres across 11 countries in Europe including the UK, Germany, France, Ireland, Sweden, Switzerland, Netherlands and Spain.
At the end of the second quarter of 2014, Interxion’s equipped space totalled 86,000m2 with a utilisation rate of 75%. In the same quarter, Interxion reported a 26% year-on-year jump in its net profit to €8.3m. Its revenue for the second quarter of 2014 at €83.6m was 9% more than the same period last year and 4% more than the first quarter of 2014. 
Interxion aims to create cloud and CDN hubs in the region with its new French datacentre. The purchase agreement with SFR will complete in the third quarter of 2014.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240226391/Interxion-invests-45m-in-Marseille-datacentre-as-profits-climb

UCAS Uses Splunk To Get Real-time Data On A-level Results Day

Feature
Thursday 14 August is D-day for the Universities and Colleges Admissions Service (UCAS), the organisation responsible for managing university and college applications. That is the day when the A-level results come out, and UCAS, its IT team and the broader group of suppliers who deliver its technology spring into life.
Another crunch day for the organisation came on 5 August, when Scotland’s Higher results were announced.
Peter Raymond, enterprise IT architect at UCAS, says the work of the Joint Operations Centre (JOC) at results time benefits from an operational intelligence capability provided by machine data indexer Splunk.
The JOC comprises about 20 people, including the organisation’s IT director, systems operations staff, architects and workers from suppliers such as Amazon, Microsoft and Oracle.
UCAS has been using Splunk to monitor its IT infrastructure following a partial migration to the cloud, when it found it needed to aggregate device logs and intelligently search them across multiple servers.
The deployment won a Computer Weekly enterprise software user award earlier this summer.
The technology, which indexes and renders searchable machine data, enables UCAS to troubleshoot, manage performance and use analytics to support the IT team. This means students can access information quickly and easily on Track, UCAS’s online application portal.
The organisation began using the technology in July 2013. “It was a big success for us in how we could visualise user experience response times and the sheer volume of transactions,” says Raymond.
For more on UCAS
UCAS adopts public cloud to process university admissions
UCAS deploys Splunk Enterprise across 40 servers and about 70 log sources, all of which are deployed through Amazon Web Services, and everything is forwarded to a Splunk server for indexing.
By indexing, searching, alerting and reporting on machine data from sources across UCAS’s IT infrastructure, Splunk gives the IT team a series of visualisations of system performance, key operational metrics (broken down by higher education institution), usage, the queries being run and how the various applications are functioning.
UCAS has 15 dashboards that give real-time monitoring and insight into system load and response times.
At its peak on 15 August last year, the Track system saw more than 180 logins per second. It is hosted on the Microsoft Azure cloud service, while AWS supports more than 350 higher education providers.
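The article does not reproduce UCAS’s actual Splunk searches, but the kind of aggregation behind such a dashboard can be sketched in plain Python. The log format, field names and paths below are illustrative assumptions, not part of the UCAS deployment, which does this work inside Splunk.

```python
import re
import statistics
from collections import Counter

# Hypothetical access-log line: "2014-08-14T09:00:01 /track/login 200 0.312"
LINE = re.compile(r"^(?P<ts>\S+)\s+(?P<path>\S+)\s+(?P<status>\d{3})\s+(?P<secs>[\d.]+)$")

def summarise(log_lines):
    """Rough equivalent of a response-time dashboard: peak logins per
    second plus median and 95th-percentile response times."""
    logins_per_second = Counter()
    response_times = []
    for line in log_lines:
        match = LINE.match(line.strip())
        if not match:
            continue  # skip lines that do not fit the assumed format
        if match.group("path").startswith("/track/login"):
            logins_per_second[match.group("ts")] += 1
        response_times.append(float(match.group("secs")))
    response_times.sort()
    p95 = response_times[int(0.95 * (len(response_times) - 1))] if response_times else None
    return {
        "peak_logins_per_second": max(logins_per_second.values(), default=0),
        "median_response_s": statistics.median(response_times) if response_times else None,
        "p95_response_s": p95,
    }
```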
The deployment of Splunk Enterprise has enabled UCAS to provide a consistent approach to log collection and retention and expose the data in a searchable form. Logs were previously available by accessing each server, which required system administrator time and did not allow related events on different systems to be found easily.
Raymond says: “The Splunk logs give us both a view of the customer experience and also operationally for us. For example, there is one integration between Azure and AWS and we monitor that integration point with Splunk.
“Instead of an error being tucked away in a log file, we can now get automatic notifications.”
The UCAS site in Cheltenham has 10 large screens, two of which are devoted to Splunk. On those two screens are 10 dashboards, created through queries in Splunk. These include dashboards devoted to tracking and monitoring the user experience of “higher education providers”. One dashboard shows the response time on an applicant enquiry, and another gives response time over a 24-hour period, says Raymond.
“UCAS lives and dies on one day a year,” he adds. “There are TV crews here, [government] ministers walking round. It’s a very visible day.”

For the original version including any supplementary images or video, visit http://www.computerweekly.com/feature/UCAS-uses-Splunk-to-get-real-time-on-A-Level-results-day

Microsoft Must Disclose Data Held In Dublin Datacentre, Rules US Federal Judge

News
Privacy and data protection in the cloud suffered a setback on Thursday as a US federal court ruled that Microsoft must comply with a US search warrant and hand over customer email data stored in its Dublin cloud datacentre.
District Judge Loretta Preska of the US District Court for the Southern District of New York upheld a US magistrate judge’s ruling on Microsoft customer data held overseas.
Microsoft’s general counsel and executive vice-president, Brad Smith, said: “The District Court’s decision would not represent the final step in this process.
“We will appeal promptly and continue to advocate that people’s email deserves strong privacy protection in the US and around the world.”
It is believed this is the first time an American enterprise has fought a domestic search warrant for customer data stored outside the US. Other providers, including Verizon, Apple and Cisco, are backing Microsoft’s challenge against the court ruling on cloud customer data.
Just a day ahead of the ruling, Smith wrote a column in the Wall Street Journal explaining why Microsoft is opposing the US government’s demand for a customer’s email stored in Dublin, Ireland.
Smith wrote, “This dispute should be important to you if you use email, because it could well turn on who owns your email – you or the company that stores it in the cloud.
More on Microsoft
Campaigners angry over Microsoft, Skype privacy ruling
“Microsoft believes you own emails stored in the cloud, and that they have the same privacy protection as paper letters sent by mail.”
The federal judge ruling comes three months after a US magistrate judge ordered Microsoft to give the District Court access to the contents of one of its customer’s emails stored on a server located in Dublin.
At that time, Microsoft challenged the ruling. It said: “The US government doesn’t have the power to search a home in another country, nor should it have the power to search the content of email stored overseas.”
But on Thursday, Judge Preska upheld the ruling, saying the physical location of the data is irrelevant. According to the court, US law authorises the American government to seek information – including the content of an email – by way of a subpoena, court order or warrant.
“Microsoft’s argument is simple, perhaps deceptively so,” Judge Francis had said in an official document in April when Microsoft challenged the ruling.
But Microsoft has argued that, just like a US search warrant in the physical world can only be used to obtain materials that are within the territory of the US, the same rules should apply in the online world. According to the Azure cloud provider, the data privacy provisions in the Electronic Communications Privacy Act (ECPA) do not apply outside of US territory.
The first US warrant for data was issued back in December 2013 when the US judges wanted access to a Microsoft customer’s email data stored in Ireland in connection with a narcotics investigation. Microsoft has continuously challenged the ruling.
Microsoft’s €480m European datacentre in Dublin , catering to its Azure cloud users, opened in 2009.
Microsoft can appeal the district judge’s decision to the Second US Circuit Court of Appeals. But the ruling may reinforce the data protection and privacy concerns about cloud services that are prevalent among European customers.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240226031/Microsoft-must-hand-over-user-data-held-in-Dublin-datacentre-rules-US-federal-judge

Energy Saving Trust Deploys Esri For Cognos To Support Policymakers

News
The Energy Saving Trust, which provides impartial advice on carbon reduction, has deployed Esri for Cognos to provide instant access to its mapping analytics portal.
The trust will use Esri to provide real-time geospatial information, overlaid with demographic information and building Energy Performance Certificates (EPCs), for local government, energy company planners and policymakers.
Will Rivers, data insight manager at EST, said: “Even large energy providers take an old-school approach. Installers need to knock on the door of every home in a street to install home insulation.”
Given that there are more than 27m addresses in the UK, Rivers said there is an opportunity to provide energy companies and government planners with real-time datasets, which can be displayed on a map and analysed spatially to support energy-efficiency programmes.
More articles on GIS
Water firm uses GIS to improve business processes
Rivers said one of the ways Esri is being used is to support policymakers in Scotland. Using EPC data, he said, the Energy Saving Trust is able to build a dataset that models 50 energy-related parameters for every home in Scotland: “Local authority can drill down and look at an individual street or estate – to see what scheme would benefit the area.”
By using this statistical modelling, he said, it is possible for the EST to infer data for houses in a street based on the EPCs of other houses in the same area.
“You end up with a rich data set. We can overlay the prevalence of electrical heating, or average wind speed plus social demographic data for issues like fuel poverty to help government [policy makers] target specific issues,” he said.
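As a rough illustration of that kind of inference, the sketch below fills a home’s missing energy-efficiency score with the median score of assessed homes on the same street. The records, field names and simple median rule are hypothetical stand-ins for the EST’s far richer 50-parameter model.

```python
import statistics

# Hypothetical records: some homes have an EPC energy-efficiency score, others do not.
homes = [
    {"street": "High Street", "epc_score": 62},
    {"street": "High Street", "epc_score": 58},
    {"street": "High Street", "epc_score": None},  # no certificate lodged
    {"street": "Mill Lane", "epc_score": 71},
]

def infer_missing_scores(records):
    """Fill a missing score with the median of assessed homes on the same street."""
    by_street = {}
    for record in records:
        if record["epc_score"] is not None:
            by_street.setdefault(record["street"], []).append(record["epc_score"])
    for record in records:
        if record["epc_score"] is None and record["street"] in by_street:
            record["epc_score"] = statistics.median(by_street[record["street"]])
            record["inferred"] = True  # flag modelled values so they are not mistaken for assessments
    return records

print(infer_missing_scores(homes))
```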

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240226534/Energy-Saving-Trust-deploys-Esri-forCognos-to-support-policy-makers

BlackLine Promises To End Financial Closing Spreadsheet Nightmare

Interview
Financial applications are a strategically core technology area for any business, and financial closing is a classic business problem – the “last mile of finance”.
BlackLine Systems’ software-as-a-service product offers the promise of agility beyond spreadsheets, and is an established part of SAP’s ecosystem.
One of its first UK customers was Schroders bank, says Mario Spanicciati, European executive vice-president of operations and executive director at the supplier.
Its business began to develop in the UK in 2012, and the firm opened an office in Pall Mall, London, in 2013, from where it has expanded into France. It has 250 employees globally. The firm entered and won the supplier of the year award in the Computer Weekly enterprise software awards in June this year.
Spanicciati describes how the company evolved from its beginnings in California in 2001, when it was focused on wealth management software. But, as a result of an engagement with First National Nebraska, it switched to account reconciliation as a speciality in 2004. The software suite now has six modules.
“Basically, we help companies to automate manual financial and accounting processes,” he says.
Many companies, says Spanicciati, do their account reconciliations manually at the end of every month using Excel spreadsheets: “We’re talking thousands to hundreds of thousands of spreadsheets.”
Four tips for faster financial consolidation
This is the problem that BlackLine Systems, among others, addresses. The supplier’s software, which is supplied as a service, is said to speed up the process and cut costs, sometimes by 50% or more, among its customers.
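To make the problem concrete, here is a minimal sketch of the kind of matching such tools automate: pairing ledger entries with bank-statement lines and surfacing the exceptions for an accountant to review. The field names and the simple match-on-reference-and-amount rule are assumptions for illustration, not a description of BlackLine’s method.

```python
from collections import defaultdict

def reconcile(ledger, statement):
    """Match ledger entries to statement entries on (reference, amount);
    anything left over needs human attention."""
    open_items = defaultdict(list)
    for entry in statement:
        open_items[(entry["ref"], entry["amount"])].append(entry)
    matched, exceptions = [], []
    for entry in ledger:
        key = (entry["ref"], entry["amount"])
        if open_items[key]:
            matched.append((entry, open_items[key].pop()))
        else:
            exceptions.append(entry)  # in the ledger but not on the statement
    unmatched_statement = [e for items in open_items.values() for e in items]
    return matched, exceptions, unmatched_statement

# Example: one matched pair, one ledger exception, one unmatched statement line.
ledger = [{"ref": "INV-1001", "amount": 250.0}, {"ref": "INV-1002", "amount": 80.0}]
statement = [{"ref": "INV-1001", "amount": 250.0}, {"ref": "FEE-77", "amount": 12.5}]
print(reconcile(ledger, statement))
```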
“The cloud delivery mechanism makes changing from spreadsheets easier”, he says. “And it helps to centralise the information. We have customers who have between one and hundreds of ERP systems globally. And it makes it easier for our customers’ auditors.
“From an innovation perspective we have been first to market in all six of our modules, and everything has been built on the same platform. And I’d say ease of use has been a big differentiator for us”.
Competitors include Trintech and Chesapeake System Solutions.
BlackLine’s Financial Close Suite for SAP is one of 40 software packages that have been endorsed by SAP. Spanicciati stresses that it is “ERP-agnostic”, and also works with Oracle, and Oracle’s acquired products, PeopleSoft and JD Edwards.
The supplier’s European customers include Boeing, Eurostar, and Schroders. It has recently signed a deal with publisher Reed Elsevier.
“I know everyone says this, but we really do want our customers to be successful with the product. And we’ve had 50% growth year on year”, says Spanicciati.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/feature/BlackLine-promises-to-end-financial-closing-spreadsheet-nightmare

How To Tame The New IT Beast Called DevOps

Feature
Clive Longbottom
A confluence of technology changes is heavily influencing how enterprise IT must work. Virtualisation, cloud, software-defined everything, big data and everything as a service are all forcing IT to change and look at a new beast called DevOps.
The majority of changes to the IT landscape are aimed at speed – speed of response of applications for users; speed in the ability to change functions, applications and processes to better respond to the organisation’s needs.
The old ways of doing IT are being forced to change. A traditional IT project follows a long timescale – developing solid project management criteria with user definition documents, project plans, reviews, change management processes and so on – and is often planned over 18 months.
The problem here is that solving a problem 18 months out is not solving the real issue, which may well have disappeared by then and been replaced by half a dozen new ones.
To the rescue rides DevOps – essentially, bringing the development and operational teams closer together through automation to support the speed of change the business requires.
DevOps is an IT practice that merges the tasks performed by a company’s application development team with those performed by its systems operations team. Software-defined infrastructure and cloud computing call for enterprises to tear down the silos between the development and operations teams.
Historically, the development team has tended to work as a self-contained unit, working away on its projects against the various definition documents until it has created what it sees as a completed application. The application then goes through a test phase – generally still under the nominal auspices of development – after which the operations team takes control of it and rolls it out to the masses.
Read more about DevOps
Reshaping IT organizations to fulfill a DevOps strategy
If, as often happened, the new application had a few teething troubles, development would get a list of these from the help desk, would put them through a fairly rudimentary prioritisation – “Does this look interesting?” – and make the changes it thought necessary within its walled environment, submitting the next iteration of the application for testing and roll out as and when.
If, however, there were more than “teething troubles”, all hell would break loose. Applications that had major troubles in the operations environment would need to be rolled back and the underlying data re-synchronised to the old application so that employees, partners and customers could work again. The development team would be presented with the problems and be left scratching its head, as it had all worked in its own environment.
Working in isolation leads to frustration and inefficiencies, as each team does not understand the limitations or challenges of the other. But more crucially, IT and the business are both badly affected.
Technology focused companies such as Facebook, Yahoo, Yammer, Amazon, Google and VMware have all embraced DevOps.
How DevOps solves the silo issue
Development staff have to be more closely involved with operations staff – and with the entire operations environment. Development should be carried out against real data sets wherever possible, so that any issues can be found at the earliest possible time.
The key to DevOps is in automating as much as possible – both in how the process of moving a development project through testing into run time occurs, and also in how any operational problems are dealt with.
Projects have to be broken down into smaller chunks – required functionality has to be strictly prioritised, and the development team has to work to get a functional system out into the operations environment rapidly to meet the highest-priority issues. “Rapidly” should be measured in weeks – a rule of thumb is that the first roll-out should be within 12 to 18 weeks.
Read more on datacentre management
Storage, business growth and virtualisation driving datacentre expansion
This drives the need to move from monolithic applications to composite ones built by pulling together functions from various sources. Existing functionality from within the datacentre should be combined with cloud apps and services , so that the wheel is not continually re-invented, using domain expertise and data sets that add value to the overall composite app where applicable.
Testing should be carried out either as a parallel implementation in the run-time environment, so that more performance issues can be picked up at the earliest possible stage, or in a pseudo-live environment with a set of “tame” testers from the actual users. 
For the parallel example, this should be relatively easy through the use of virtualisation. Copies of existing databases can be spun up against virtual machines running the DevOps code. Testing can then be carried out against the existing network loads so that performance is evaluated at the same time as the efficacy of the code. 
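A minimal sketch of that approach, assuming Docker, the official postgres image and a pg_dump snapshot are to hand; the container name, port and file names are placeholders rather than any real environment.

```python
import os
import subprocess
import time

def spin_up_test_copy(dump_file: str = "prod_snapshot.dump") -> None:
    # Start a throwaway database container for the parallel test run.
    subprocess.run(
        ["docker", "run", "-d", "--name", "devops-test-db",
         "-e", "POSTGRES_PASSWORD=test", "-p", "5433:5432", "postgres:15"],
        check=True,
    )
    time.sleep(10)  # crude wait; in practice, poll until the database accepts connections
    # Load a copy of the production data so new code is exercised against real data sets.
    subprocess.run(
        ["pg_restore", "-h", "localhost", "-p", "5433", "-U", "postgres",
         "-d", "postgres", "--no-owner", dump_file],
        check=True,
        env={**os.environ, "PGPASSWORD": "test"},
    )

def tear_down() -> None:
    # Discard the copy once the parallel run is finished.
    subprocess.run(["docker", "rm", "-f", "devops-test-db"], check=True)
```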
Ensure that exceptions can be rapidly and effectively handled. The testing phase is where people power really counts – people can perceive patterns of usage and problems in how others are using the application far more effectively than computers can. Throw a lot of people at the testing phase – monitor and measure everything that you can.
Once the testing phase is done, it should then be a case of just switching the application over to the live database and the users over to the new code.
Remember that DevOps is, by its very nature, an iterative process. Getting a 60%, 70% or 80% solution out means that a further 40%, 30% or 20% is still required. The aim should always be to solve a prioritised proportion of what is left, plus whatever else has come up as new requirements since the last code was rolled out.
How to make your DevOps strategy fruitful
As much as possible needs to be automated. As code elements are completed by the development team, integrated and fully audited capabilities to move these elements from development to test and then to the run-time environment will help ensure that incremental changes and improvements can be made.
Roll-back can also be automated – although a well-run DevOps programme should not require roll-back to be carried out, it should always be available as a Plan B. Being able to get back to a known position – comparable to a backup and restore recovery time and recovery point objective (RTO/RPO) – should be part of any DevOps setup.
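As a sketch of what that automation might look like, the snippet below promotes a build, polls a health endpoint and falls back to the last known-good version if the checks fail. The release script, version strings and health URL are placeholders, not a reference to any particular toolchain.

```python
import subprocess
import time
import urllib.request

def healthy(url: str, attempts: int = 5) -> bool:
    """Poll a health endpoint; treat an HTTP 200 as healthy."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                if response.status == 200:
                    return True
        except OSError:
            pass  # not up yet, or failing
        time.sleep(10)
    return False

def deploy(version: str) -> None:
    # "./release.sh" stands in for whatever promotes a tested build into run time.
    subprocess.run(["./release.sh", "deploy", version], check=True)

def promote(new_version: str, last_good: str, health_url: str) -> None:
    deploy(new_version)
    if not healthy(health_url):
        deploy(last_good)  # Plan B: return automatically to the known-good position
        raise RuntimeError(f"{new_version} failed health checks; rolled back to {last_good}")

# promote("2.4.1", "2.4.0", "https://ops.example.internal/healthz")
```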
Feedback also needs to be automated. There is no point in waiting for the help desk to aggregate comments and feedback and report that through to the DevOps team. Capture feedback as part of the application; ask for feedback as part of the process. Act on the feedback according to how much it is impacting the business – not how interesting the problem sounds at a technical level.
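Capturing feedback in the application itself can be as simple as an HTTP endpoint the app posts to, tagged with the business impact so it can be prioritised on that basis. The Flask route, field names and in-memory list below are an illustrative sketch, with the list standing in for whatever queue or database the DevOps team actually reads.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
feedback_store = []  # stand-in for a real queue or database

@app.route("/api/feedback", methods=["POST"])
def capture_feedback():
    payload = request.get_json(force=True)
    feedback_store.append({
        "screen": payload.get("screen"),
        "message": payload.get("message"),
        "business_impact": payload.get("business_impact", "unknown"),
    })
    return jsonify({"status": "received"}), 201

if __name__ == "__main__":
    app.run(port=8080)
```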
Overall, DevOps is an excellent approach to speeding up how an IT department can ensure that it is meeting the needs of the business. However, a badly implemented DevOps approach will lead to more errors and users being unhappy with the experience they receive. 
Implementing a DevOps strategy is not easy because it requires enterprises to embrace a new culture. DevOps is culture, process, technology and people.
It requires deep cultural and organisational change, and that involves changing behaviour – a lot. It means throwing out decades of embedded explicit and implicit practices. It involves telling the veterans accustomed to running things that much of what they know and do every day is obsolete, and that is hard.
Just putting the development and operations teams together in one room will not lead to a successful DevOps strategy – each team must understand and appreciate the importance of working together in the cloud and software-defined era.
Automating, incentivising the development and operations teams to work together, allocating time for training employees, developing new enterprise architecture standards around build-to-operate principles and aligning IT with business objectives are all steps to ensure a DevOps strategy is not undermined.
Ensuring that a few simple areas are covered off will help to make DevOps work well.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/feature/How-to-tame-the-new-IT-beast-called-DevOps

In-memory Database Technology Puts Mitie And Wooga In The Fast Lane

Case study
Lindsay Clark
It takes years to train top accountants. But how much time do they spend waiting for data to load? This is a question Mitie, a £2bn business services and outsourcing firm, has been asking itself.
By looking at the time it was taking to analyse company finances using spreadsheets and other reporting tools, the firm reasoned it would be worth investing in a better approach, says Edward Callaghan, Mitie’s finance director for London and South-East England.
As a result, the company is at the proof of concept stage in developing a reporting system based on IBM Cognos running on a DB2 Blu in-memory database taking data from Oracle Financials.
These systems store the entire database in memory and access it directly, without disk input/output operations, improving application performance.
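The principle is easy to demonstrate with SQLite’s in-memory mode, which keeps the whole database in RAM so queries never touch disk. This is only a toy illustration of the idea, not DB2 BLU or Mitie’s setup, and the table and figures are invented.

```python
import sqlite3

# ":memory:" creates a database that lives entirely in RAM.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (contract TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("CT-001", 1200.0), ("CT-001", 830.5), ("CT-002", 410.0)],
)
# The aggregation below never performs disk I/O.
for contract, total in conn.execute(
    "SELECT contract, SUM(amount) FROM invoices GROUP BY contract"
):
    print(contract, total)
```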
Although Callaghan would not reveal the proposed investment, it could be cheap compared with accountants’ time, some of which is currently spent extracting reports from Mitie’s Oracle Financials system.
“You can look at the number of extracted reports out of Oracle and the duration each takes to download and measure the total time from raw data to final output in our current approach. We’re looking to release that time into doing more valuable work,” he says.
Read more on uses of in-memory database technology
Lush cleans up on data with QlikView
Phase two of the project proposes to link Oracle Project Accounting to the in-memory database. Contract managers currently produce analysis for meetings, but are not able to answer follow-up questions until some time afterwards.
In the south-east, Mitie manages about 140,000 jobs per year, while each manager could look after 100 contracts. Using in-memory analytics hosted on IBM Power 8 servers, Callaghan says project managers could answer queries during meetings, making them more productive and reducing the time it takes to make important decisions.
Mitie also plans to exploit in-memory technology in asset management and capture information from mobile devices on site, increasing the data available for analysis to make better management decisions on the fly. But the first project focuses on improving financial transparency and is set to start in the next quarter.
Mitie provides an example of moving to in-memory database technology to improve data exploration. Such tools come from companies like Qlikview, Tableau, Teradata, MicroStrategy and Exasol among others. 
Case study: Developer ups its game with Exasol
In mobile and social gaming, players are fickle creatures. Too easy, and they get bored and move to another game. Too hard, and they give up and do the same. For developers, keeping players engaged is a constant challenge requiring continual improvements to each game.
For Berlin startup Wooga, this means structuring the business around each game so designs can rapidly respond to the behaviour of the 35 million users who log in to its games each month. The firm collects about 200GB of data per day from its players, in the form of game events codified in HTTP requests.
Founded in 2009, the business built its systems around open source software and used MySQL for analytics. But it soon reached the limitations of these technologies.
Markus Steinkamp, Wooga’s head of business intelligence (BI), says: “You came in in the morning, put some queries in SQL and after lunch got the results. That is not the way to push analytics in this company.”
Simple calculations, such as finding the average game time, could be grindingly onerous. Calculating the mean is easy, but not useful, because it can be skewed by people who log out after a few seconds. The median is a more powerful average, but is painful to calculate without in-memory technology, Steinkamp says. 
“You need the whole results set to sort the data and find the median. You have to put the whole result somewhere. It is better to do it in memory; otherwise you have to swap the results on to disk and re-read it – it is too painful.”
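A tiny worked example of the point: with a handful of hypothetical session lengths, the mean is dragged down by a few instant log-outs, while the median requires the complete, sorted result set to be held somewhere – which is why computing it in memory helps.

```python
import statistics

# Hypothetical session lengths in seconds; three instant log-outs skew the mean.
session_lengths = [3, 4, 5, 610, 640, 700, 715, 730, 745, 800]

mean = statistics.mean(session_lengths)      # 495.2s, pulled down by the short sessions
median = statistics.median(session_lengths)  # 670.0s, needs the whole sorted set

print(f"mean: {mean:.1f}s, median: {median:.1f}s")
```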
Wooga deployed in-memory database and analytics tools from Exasol, hosted in the cloud in the vendor’s datacentre.
The game developer eschewed the opportunity to buy the technology from larger suppliers, partly for technical and partly for cultural reasons.
“Buying a solution is quite rare at Wooga. But Exasol is quite a small company – we can talk to them and we know them. A solution from IBM, SAP or Oracle is completely different – you have to talk to the sales organisation and the prices are far higher. We talked to SAP and we were able to use in-memory tool Hana for a year at no cost but the whole culture was too different,” says Steinkamp.
While a central BI unit provides technology support, in-memory technology allows analytics to be devolved to where decisions about game modification are made, Steinkamp says.
“The games teams have every function – engineers, product managers, game design, data analysts and art people. They have weekly meetings for new features and the analyst is quite important in that. They have to say, ‘This feature has the potential to increase retention by X,’ but the product lead might ask questions and they need the answer the same day, not three days later. Data analysts are hands on and independent. If you have a central team then you have dependencies and we don’t like that.”
As well as providing benefits in traditional businesses, startups such as online gaming firm Wooga are using in-memory data exploration to bring data analysts closer to business teams (see case study above), improving the speed and efficiency of decision making.
The early history of in-memory
This is a far cry from early applications of in-memory technology. Martha Bennett, principal analyst with Forrester Research, recalls that these were in financial services, where algorithms could be applied to live trading data in real time.
In these markets, which have used such technology since the mid-2000s, even small improvements in the performance of financial trades could justify the costs of holding large amounts of data in solid state storage.
These technologies, which allow pre-programmed algorithms to make decisions on live data, are now being deployed more widely in other applications where speed is important. 
In e-commerce, they can help with dynamic pricing, while in online advertising they can assist ad placement. This application-specific approach is also extended to making decisions on live data in ERP or supply chain systems, as suppliers such as SAP and Oracle adopt the technology.
Bennett says the performance of in-memory is now providing benefits for a wide range of businesses and use cases: “For some businesses it is very important. It sounds like a cliché, but business is moving faster and competitive advantage does lie in your ability to do more with data more quickly.”
But users need to get to grips with data basics before they can begin to exploit these new technologies, she says.
“A lot of organisations really don’t have a good handle on their data. They have issues with who owns the data, they don’t know what rules to apply or how long it is kept. The companies that are moving ahead [with in-memory] are the ones where the business not only understands the data but is also prepared to take ownership of it. That is a cultural thing.”
Another pitfall in the application of in-memory technology comes when applying algorithmic decision-making on live data, Bennett says.
“You need to understand what they are doing and you need to understand your business. You need to watch what is going on because an algorithm deteriorates. Do you know what it is going to do? How are you going to maintain it? A lot of companies do not understand this,” she says.
With the application of algorithms to in-memory data, businesses are in danger of “getting the train wreck faster” but there is also the opportunity to experiment in shorter cycles and discover which algorithms work best, if done in a controlled manner, Bennett says.  
In-memory analytics vs the traditional data warehouse
Gartner forecasts widespread adoption of in-memory database technology in the coming years, not only speeding up existing applications but also supporting new business opportunities.
But Gartner research director for information management, Roxane Edjlali, says in-memory technology will not replace the data warehouse by performing all analytics as part of the application stack.
“You can do transactional analytics in-memory but this does not remove the need for the data warehouse in the next few years. Firstly, managing all this data in-memory can be expensive, and may not make sense. Not all business applications will be candidates to move in-memory. You are likely to see a mixed environment,” she says.
“Secondly, you bring semantic consistency and cleanse data by moving it to a data warehouse. This process needs to happen somewhere. The result of having multiple apps running analytics is you do not have consistent data between them.”
In some cases this might make sense, but organisations will still need to achieve consistency of their data at some point, she says.
The benefit of in-memory databases is speed. Some applications need it and some do not. But the technology is also allowing businesses to change the way they work and put analytics closer to the decision-making coalface.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/feature/In-memory-database-technology-puts-Mitie-and-Wooga-in-the-fast-lane

Vince Cable Welcomes £135m Datacentre Investment In Docklands

News
Telehouse, the London-based colocation and datacentre provider, is building a next-generation mission-critical datacentre in the Docklands, after Japanese parent company KDDI decided to pump in £135m to expand the existing Docklands infrastructure.
“It is good news that Japanese ICT company KDDI has decided to invest a further £135m in their global datacentre at the Docklands,” said Vince Cable, UK secretary of state for business, innovation and skills. “Britain’s economy is growing thanks to Japanese investment and in 2013/14, it was the second biggest investor in the UK, starting over 100 new projects and creating 3,000 new jobs.”
Over the past 25 years, KDDI has invested over £272m, with £172m over the past five years alone.
The investment, in line with the UK government’s policy of inward investment to encourage innovation, growth and new jobs , will enable businesses to take advantage of evolving technologies, according to the government.
The new datacentre facility, built next to the existing facility , will be called Docklands North Two and will bring 23,134m2 of floor space, taking Telehouse’s total presence in London to 73,395m2.
North Two will feature improved network connectivity to provide British businesses with the digital infrastructure needed to take advantage of the “new internet”, driven largely by apps, smartphones, tablets and cloud computing.
According to Telehouse, 78% of all UK businesses have at least one cloud-based service, and the use of mobile devices is surging rapidly.
“North Two has been created with the new internet in mind. It will house mission-critical infrastructure, which will enable the development of hybrid services for customers,” said Hiroyuki Soshi, managing director of Telehouse Europe.
“The £135m investment in North Two will be crucial to the expansion of one of the world’s most critical internet hubs and marks a new dawn for datacentres.
“We believe that as the internet continues to develop at such a dramatic pace, the underlying infrastructure in the Docklands must stay ahead to meet the needs of the future,” Soshi said.
Read more about Telehouse and datacentres
Modular datacentres yield 30% more savings than traditional datacentres
Telehouse’s existing Docklands facility serves many telecommunication carriers, such as London Internet Exchange (Linx), Colo-X and Claranet.
Further capacity at Docklands datacentre is important to ensure this vital network ecosystem can continue to expand and thrive to service the needs of the UK and Europe’s digital economy, said Tim Anker, founder and director of Colo-X.
“The Telehouse Docklands datacentre already facilitates the majority of Linx capacity, and internet traffic will inevitably grow, especially with the transition to 100G technology now underway,” said John Souter, chief executive of Linx.
“With such a huge proportion of all UK internet traffic flowing through Linx, this investment in the national critical infrastructure of the UK gives us great confidence.”
Alastair Pooley, vice-president for datacentres at security company Sophos, said: “Sophos relies on a mixture of cloud and physical datacentres to provide service to our customers. We believe cloud is key to our success, but underpinning it are strong facilities where we can host our critical data on our own hardware platforms.”

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240225279/Business-Secretary-Vince-Cable-welcomes-135m-datacentre-investment-in-Docklands

Smart Meters, Digital Channels And CRM Are Top Priorities For British Gas

News
Smart meters, digital channels and reducing customer complaints by implementing a new CRM system are top of the IT agenda for British Gas over the coming year.
The company highlighted a series of technology aims and ambitions for the next 12 months in its financial results published today.
The smart connected home
The smart connected home is key to the future growth of British Gas, with plans to target 2.4 million residential smart meter installations by the end of next year.
The company said it has already installed one million residential smart meters, with plans to roll out 1.3 million by the end of the year, but only 350,000 customers are so far receiving smart energy reports.
“We strongly support the 2020 mandate for full smart roll-out and are on track to support the ‘go live’ of the Data Communications Company in December next year and to lead industry testing of the new systems in mid-2015. We encourage the industry, government and regulatory bodies to maintain momentum on all fronts to ensure the smart roll-out is delivered on schedule,” said the British Gas results statement.
The company, which is owned by Centrica, said smart meters will bring an end to estimated bills and give customers a greater ability to monitor and reduce consumption, while making it easier and faster to switch between suppliers.
Additionally, the firm has sold 100,000 smart thermostats to date and increased sales of the Hive brand, which provides remote heating control and was launched almost a year ago.
It is also trialling a smart meter “Free Saturdays or Sundays” energy tariff, which it hopes will be available widely from next year.
Read more on British Gas:
Centrica replaces legacy power station systems with SAP
British Gas hopes that providing added-extra technologies in the home will make its service more compelling.
“Apps and the internet have transformed home entertainment, but they have not had much impact on the rest of the home,” said Andrew Brem, managing director of commercial and product development at British Gas’s Connected Homes business, in an interview with Computer Weekly in September last year.
“People seem to lead unpredictable lives, but unlike much of life, the home is totally fixed and not related to how they run their life.”
But this is changing. For instance, paying for parking via a smartphone has many benefits, he said.
Through Connected Homes, British Gas provides a service and app to switch the heating on and off remotely.
According to Brem, 40% of users interact with the remote heating control app at least once a day. “Almost 25% of gas usage is wasted when either you are not at home or when you are asleep. Remote heating control lets you control gas consumption,” he said.
Digital channels
Like any customer-facing business, British Gas has the challenge of dealing with consumers online. The company said that around two-thirds of its customer interactions are now made through digital channels, with around half of those coming from a mobile or tablet device.
The energy supplier also pointed out that the number of residential sales coming through digital channels has nearly trebled in the first half of this year compared to last, while 1.3 million customers have downloaded its mobile application.
Customer relationship management
The company is in the process of implementing a new billing system, which will simplify bills for customers and enable British Gas to deliver improved service at lower cost.
The new customer relationship management (CRM) platform will simplify direct debit payments and the process customers have to go through when moving home. British Gas intends to complete the migration of residential customers to the new single billing and CRM platform in the autumn.
“We are targeting a significant reduction in customer complaints over the next three years,” said British Gas, which wants to create a more integrated customer experience using the new CRM platform.
In the past, British Gas has had problems with CRM. The company has worked hard to repair a damaged reputation since deteriorating customer service standards caused the defection of approximately a million customers in 2006. Between 2008 and 2010, the firm moved from having the least satisfied customers to having the most favourable ratings, according to a Morgan Stanley survey.
In the last couple of years it has rolled out an SAP CRM system, which replaced a myriad of systems, including Siebel databases and bespoke or highly modified software, and was intended to simplify the processes for call centre agents handling the queries of millions of UK customers.
With its previous CRM system, British Gas employees had to process 20-30 screens of information, but the SAP system reduced this to “a couple” instead.
“The process of moving home and changing suppliers is very complicated, there are many complexities that make up the UK system – it is not a British Gas process – but you have to deal with all that,” said British Gas CIO David Cooper in an interview with Computer Weekly in 2012.

For the original version including any supplementary images or video, visit http://www.computerweekly.com/news/2240225930/Smart-meters-digital-channels-and-CRM-are-top-priorities-for-British-Gas