Tag: cloud computing

The recent Telstra security breaches–how were they handled?

Over the last year, there has been an increasing number of incidents where customers of high-profile companies have had their identifying data compromised. One of these incidents that put everyone in the IT world “on notice”, especially those involved in consumer-facing IT like ISPs or online services, was the Sony PlayStation Network / Qriocity break-in by LulzSec / Anonymous.

Around that time, I attended a presentation and interview concerning the security of public computing services, hosted by Alastair MacGibbon and Brahman Thiyagalingham from SAI Global; you can read the report here.

The BigPond incident

Over the last weekend, Telstra suffered a security breach that compromised the user details of some of its BigPond Internet-service customer base. The breach came about because a customer-service search Webpage was exposed to the public Internet rather than being confined to Telstra’s own customer-service network.

The privacy compromise was discovered on Friday 9 December 2011 (AEDT) and mentioned on the Whirlpool forum site. It took the form of an in-house “bundles” search page exposed to the Web, putting at risk a database containing the usernames, passwords and fully-qualified email addresses of a large part of the customer base.

Telstra’s response

But Telstra responded very quickly, locking down the BigPond customer email servers and Web-based self-service front-ends while it investigated the security compromise. Customers whose data was exposed had their passwords reset and were required to call the BigPond telephone support hotline as part of the process.

As I have maintained an email account with this service for a long time, I took steps to change the password on that account, even though I wasn’t one of the customers subject to the aforementioned mandatory password reset.

Telstra also maintained a live channel of communication with its customers through its own Web sites, through updates to the main media channels and through a constantly-updated Twitter feed. Once the email system was open for business again, a follow-up email was broadcast to all BigPond customers explaining what had happened.

My comments on how this was handled

Like the Sony PlayStation incident, this one affected a high-profile, long-established brand which, like other incumbent telecommunications-service providers, carries a bittersweet connotation. The brand is associated with a portfolio of well-established, high-quality, stable telecommunications services, but it has also had negative associations with poor customer service and expensive plans.

What I saw was that, after the Sony incident and similar attacks on other key brands, Telstra’s IT divisions took no chances with the data representing their customer base. They quickly locked down the affected services and forced the necessary password resets to reduce further risk to customers, as well as keeping customers and the public in the loop through their media, Web and Social-Web channels.

The Telstra incident also emphasised that the risks can come from within the affected organisation, whether through carelessness or, at worst, deliberate treachery by staff. As I said in the previously-mentioned interview and conference article, Australia needs data-protection legislation and procedures so that a proper response can occur when these kinds of incidents happen.

How can the Occupy campaigns and cloud computing help the small or midsize business

Article

HP Blogs – How can Occupy Wall Street and Cloud Computing hel… – The HP Blog Hub

My Comments

The recent “Occupy” movements, which used the Social Web to build their critical mass, set out to highlight the resource disparity between big business on one side and ordinary people and small and midsize businesses on the other.

This occurred at the same time that consumers and small-to-medium businesses were becoming heavily aware of the concepts of “cloud computing” and computing-as-a-service. In some ways, these can make computing services that would otherwise be out of reach of the 99% accessible to that group, rather than being reserved for the 1% that represents the “big end of town”.

When I visited the “Big Picture Experience” computing conference hosted by Microsoft in Melbourne this past Wednesday (AEDT), there was a lot of emphasis on using this kind of cloud computing and computing-as-a-service to create a flexible workforce. Applications promoted included shared-document management and unified communications, with these applications linking to the business via Internet connections.

They even proposed that small and medium businesses that can’t afford their own servers gain this functionality by renting the services from other companies, in much the same way that we can rent disk space for our Web sites from Web-hosting companies like GoDaddy. It is also similar to how some small-business operators work out of a garage yet rent a self-storage lockup from Fort Knox or Big Yellow for extra goods, or hire a truck from Budget or U-Haul when they need extra transport capacity.

These concepts can make it feasible for smaller operations to expand without it costing them an arm and a leg, because they allow practices like telecommuting or shared-desk working, which in turn can reduce the physical size of the business’s premises.

Cloud computing and computing-as-a-service can open up “big-business” paths to smaller operations. Examples may include hosted archiving-for-compliance, or access to sophisticated business systems and practices like multi-tier loyalty programs for independent businesses.

This kind of computing can then become the rising tide that lifts many boats and yields flexibility across business sizes. In some ways, it could let small and medium business owners entertain “big-business” ambitions.

What is this private cloud functionality being touted with NAS devices?

Netgear ReadyNAS – the NAS as the heart of the personal cloud

I am seeing increasing reference to the “cloud” concept in vendors’ marketing literature for consumer and small-business network-attached storage devices. It is typically pitched as a “personal cloud” surrounding the NAS device, and the term is used across the product range.

Examples of this include Western Digital’s My Book Live NAS, PogoPlug USB file servers and Iomega’s “Cloud Edition” NAS range.

What it is about

This feature is primarily an easy-to-establish remote-access system for the NAS device, so you can gain access to the files on the device from the Internet. The manufacturers tout it as an alternative to storing data on public-cloud file-storage services like Dropbox, iCloud or Windows SkyDrive, or to setting up private FTP or HTTP access to the data-storage facility your ISP or Web host may provide.

It relies on the NAS carrying vendor-supplied software that links with a cloud-based service, making the device easy to locate on the Internet even if you use a regular dynamic-IP Internet service. The vendor may supply desktop and mobile software to facilitate this discovery and/or establish a user subdomain or directory name under their “remote-access” service domain.

Of course, your data still resides on the NAS, with the vendor’s service cloud acting as the Internet-side discovery link for the device. As well, all of these personal clouds encrypt the connection to a similar standard to what secures your Internet-banking session.
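
As a rough illustration of how such a discovery link can work, the sketch below shows a NAS-side “heartbeat” that periodically reports the device’s reachability to a hypothetical vendor relay over HTTPS, so a per-user hostname keeps pointing at the NAS even as the ISP changes its dynamic IP address. The relay URL, device token and JSON fields are my own placeholders, not any vendor’s actual API.

```python
# Minimal sketch of a NAS "phone home" loop for a vendor-hosted discovery service.
# ASSUMPTIONS: the relay URL, device token and JSON fields are hypothetical;
# real vendor services (WD, Iomega, PogoPlug, etc.) use their own protocols.
import json
import time
import urllib.request

RELAY_URL = "https://relay.example-vendor.com/api/register"   # hypothetical endpoint
DEVICE_TOKEN = "device-token-issued-at-setup"                  # hypothetical credential

def report_presence() -> None:
    """Tell the relay which address and port the NAS is currently reachable on."""
    payload = json.dumps({"token": DEVICE_TOKEN, "port": 8443}).encode("utf-8")
    request = urllib.request.Request(
        RELAY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # HTTPS means the session is protected by TLS, the same class of
    # encryption that secures an Internet-banking session.
    with urllib.request.urlopen(request, timeout=10) as response:
        print("Relay acknowledged:", response.status)

if __name__ == "__main__":
    while True:
        try:
            report_presence()
        except OSError as error:          # network hiccups shouldn't kill the loop
            print("Registration failed, will retry:", error)
        time.sleep(300)                   # re-register every five minutes
```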

This idea has existed for the last few years, with vendors providing simplified remote-access solutions for their NAS products, but they are now using the current emphasis on cloud-computing technology as a marketing tool for this functionality. This is in a similar vein to how established online services have been marketed with the “cloud” term even though they have long worked on this concept.

How can it be taken further

Currently, NAS vendors are taking this cloud concept further on smartphones and tablets by providing free data-access apps on the platforms’ app stores. Here, the apps let users work the mobile device’s own user interface to transfer data between the NAS and the device’s local storage. Some of us would see it as a way to offload picture data from the smartphone to the DLNA-enabled NAS, or to pull important data down to the smartphone or tablet.

Netgear is even working with Skifta to provide remote access to media content on its ReadyNAS units and to allow a PC or Android phone to share content from the remote ReadyNAS device with DLNA-compliant AV equipment.

The Iomega solution implements the Personal Cloud concept as a backup and peer-to-peer replication setup as well as a remote-access method. But as more manufacturers get on the bandwagon, the issue of providing a vendor-independent “personal cloud” will arise, in order to encourage competition and innovation.

What should my network have

The network has to have a router that is set up for UPnP IGD functionality at its network-Internet “edge” for the cloud-based remote access to run properly. This will apply to most retail and ISP-supplied routers, but you may have to make sure this function is properly enabled.
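
To show roughly what that means in practice, here is a small sketch of the kind of UPnP IGD port-mapping request a NAS remote-access agent makes at the router. It is written against the third-party miniupnpc Python binding (assumed to be installed), and the port numbers are arbitrary examples rather than what any particular NAS actually opens.

```python
# Sketch of the UPnP IGD port-mapping request a NAS remote-access agent
# typically makes at the router. Assumes the third-party "miniupnpc"
# Python binding is installed; port numbers are arbitrary examples.
import miniupnpc

upnp = miniupnpc.UPnP()
upnp.discoverdelay = 200                 # milliseconds to wait for router replies
devices_found = upnp.discover()          # broadcast an SSDP search on the LAN
upnp.selectigd()                         # pick the Internet Gateway Device (the router)

print("Found", devices_found, "UPnP device(s)")
print("LAN address of this host:", upnp.lanaddr)
print("Router's WAN address:", upnp.externalipaddress())

# Ask the router to forward external TCP port 8443 to this host's port 8443,
# which is what lets the vendor's cloud relay reach the NAS from the Internet.
upnp.addportmapping(8443, "TCP", upnp.lanaddr, 8443,
                    "Personal-cloud remote access (example)", "")
```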

You don’t need a fixed IP address or a “DynDNS” client running on your equipment for this personal cloud to operate, because the vendor-supplied software on the NAS takes care of the location and access function. But the network should have a reliable Internet connection, and you may want to put the NAS and the network-Internet “edge” equipment on an uninterruptible power supply to assure high availability even under rough power conditions. It may be worth reading the article I wrote about keeping “sanity” on your home network during periods of power unreliability if you want to keep that personal cloud alive.

But avoid the temptation to use a Wi-Fi wireless connection to connect a NAS to your router, even if the NAS does have Wi-Fi connectivity. Instead, connect it to your router with an Ethernet cable, so you have reliable operation.

Conclusion

In the context of the consumer or small-business network-attached storage system, the “cloud” feature is simply being used as a way to describe a simplified remote-access environment for these devices.

Interview and Presentation–Security Issues associated with cloud-based computing

Introduction

Alastair MacGibbon – Centre For Internet Safety (University of Canberra)

I was invited to an interview with Alastair MacGibbon of the Centre For Internet Safety (University of Canberra) and Brahman Thiyagalingham of SAI Global, who is involved in auditing computing-service providers for data-security compliance.

This interview, and the presentation delivered by Alastair which I attended afterwards, concerned the issue of data security in the cloud-driven “computing-as-a-service” world of information technology.

Cloud based computing

We often hear the term “cloud computing” used to describe newer outsourced computing setups, especially those which use multiple data centres and servers. But, in the context of this interview, we use the term to cover all “computing-as-a-service” models currently in place.

Brahman Thiyagalingham – SAI Global

These “cloud-based computing” setups are in use by every consumer and business owner or manager as they go about their online and offline lives. Examples include client-based and Web-based email services, the Social Web (Facebook, Twitter, etc), photo-sharing services and online-gaming services. But the concept also encompasses systems that are part of our everyday lives, like payment for goods and services; the use of public transport, including air travel; and private and public medical services.

This trend is growing as more and more companies offer information solutions for our work or play that depend on some form of “computing-as-a-service” backend. It also encompasses building control, security and energy management, as well as telehealth, with these services delivered through outsourced backend servers.

Factors concerning cloud-based computing and data security

Risks to data

There are many risks that can affect data in cloud-based computing and other “computing-as-a-service” setups.

Data theft

The most obvious and highly-publicised risk is the threat to data security. This can range from the computing infrastructure being hacked, including malware attacks on client or other computers in the infrastructure, to social-engineering attacks on the service’s participants.

A clear example of this is the recent attacks on Sony’s online gaming systems like the PlayStation Network. A successful break-in in April caused Sony to shut down the PlayStation Network and Qriocity for a month. Then a break-in attempt against many PlayStation Network accounts took place in the week ending 13 October 2011.

Attacks on data aren’t just the work of lonely script kiddies anymore. They are being carried out by organised crime, by competitors engaging in industrial espionage and by nation states engaging in economic or political espionage. The data being stolen includes the identities of end users; personal and business financial data; and business intellectual property like customer information, the “secret sauce” and details about the brand and image.

Other risks

Other situations can compromise the integrity of the data. For example, a computing service provider could become insolvent or change ownership. This can affect the continuity of the computing service and the availability of the data on its systems. It can also affect who owns the actual data held in those systems.

Another situation is a system or network breakdown or a drop in performance. This may be caused by a security breach, but it can also be caused by ageing hardware and software or, as I have seen more recently, by an oversubscribed service facing more demand than it can handle. I have mentioned this last scenario on HomeNetworking01.info in relation to Web-based email providers like Gmail becoming oversubscribed and performing too slowly for their users.

Common rhetoric delivered to end-users of computing services

The industry has tended to place the responsibility for the data security of these services onto the end users of the services.

Typically the mantra is to keep software on end computers (including firmware on dedicated devices) up-to-date; develop good password habits by using strong passwords that are regularly changed and not visible to others; and make backup copies of the data.
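
As a small illustration of the “strong password” part of that mantra, the sketch below generates a random password with Python’s standard secrets module; the length and character set are arbitrary choices of mine, not a policy recommended in the presentation.

```python
# Minimal sketch of generating a strong random password, assuming Python 3.6+
# for the standard "secrets" module. Length and character set are arbitrary
# illustrative choices, not a recommendation from the presenters.
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from letters, digits and symbols."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_password())
```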

New trends brought on by the Social Web

But some of these practices are being undone by the use of the Social Web. One is the use of password-reset questions and procedures based on facts known to the end user; those facts can be discovered by crawling data left available on social-networking sites, blogs and similar services.

Similarly, consumer sites like forums and comment trees are implementing single-sign-on setups that use credential pools hosted by services popular with consumers, namely Google, Facebook and Windows Live. This also extends to “account-tying” by popular services, so that you are logged on to one service if you are logged on to another. These arrangements can create a weaker security environment and aren’t favoured by companies like banks, which hold high-stakes data.
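
For readers unfamiliar with how these credential-pool sign-ons work, here is a generic sketch of the redirect step in an OAuth-2.0-style single sign-on, where a forum hands the login over to a big consumer identity provider. The endpoint, client ID and redirect URI are hypothetical placeholders, not any real provider’s values.

```python
# Generic sketch of the redirect step in an OAuth-2.0-style single sign-on,
# where a forum delegates login to a consumer identity provider.
# The endpoint, client_id and redirect URI are hypothetical placeholders.
import secrets
from urllib.parse import urlencode

AUTHORIZE_ENDPOINT = "https://idp.example.com/oauth2/authorize"  # hypothetical provider
CLIENT_ID = "forum-site-client-id"                               # issued when the site registers
REDIRECT_URI = "https://forum.example.com/sso/callback"          # hypothetical return address

def build_login_redirect() -> str:
    """Build the URL the forum sends the visitor to for sign-in at the provider."""
    state = secrets.token_urlsafe(16)   # anti-forgery value the forum checks on return
    query = urlencode({
        "response_type": "code",        # ask the provider for an authorisation code
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "email",
        "state": state,
    })
    return f"{AUTHORIZE_ENDPOINT}?{query}"

if __name__ == "__main__":
    print(build_login_redirect())
```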

The new direction

As well, it has previously been very easy for a service provider to absolve itself of the responsibility it has to its users and the data they create, through the use of complex legalese in service agreements that users must assent to before signing up to the service.

Now the weight of data security is being placed primarily on the service providers who offer these services rather than on the end users themselves. Even if the service provider is simply supplying technology to facilitate another organisation’s operations, it will have to be responsible for that organisation’s data and the data stream created by that organisation’s customers.

Handling a data break-in or similar incident

Common procedures taken by service providers

A typical procedure for handling a compromised user account is that the account is locked down by the service provider and the user is then forced to set a new password for it. In the case of compromised banking and other cards, the cards would be voided so that retailers or ATMs seize them, and the customer would be issued with a new card and have to choose a new PIN.
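
To make that procedure concrete, here is a minimal sketch of the lockdown-and-reset flow as I understand it; the account store, function names and flags are hypothetical illustrations, not drawn from any provider’s actual system.

```python
# Minimal sketch of a "lock down then force a reset" flow for a compromised
# account. The Account class, flags and notification step are hypothetical
# illustrations, not any provider's actual procedure.
from dataclasses import dataclass, field


@dataclass
class Account:
    username: str
    locked: bool = False
    must_reset_password: bool = False
    active_sessions: list = field(default_factory=list)


def handle_compromise(account: Account) -> None:
    """Lock the account, kill its sessions and require a new password."""
    account.locked = True                    # no logins until the reset is done
    account.active_sessions.clear()          # invalidate anything already signed in
    account.must_reset_password = True       # next contact must set a new password
    notify_user(account)


def notify_user(account: Account) -> None:
    # Placeholder for the out-of-band step, e.g. "call the support hotline".
    print(f"Notify {account.username}: account locked, password reset required")


if __name__ == "__main__":
    handle_compromise(Account("example.customer"))
```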

A question raised in today’s interview and presentation was what was placed at risk during the recent Sony break-ins. The typical report was that the customers’ login credentials were compromised, with some unconfirmed talk of the customers’ credit-card and stored-value-wallet data also being at risk.

Inconsistent data-protection laws

One issue raised today was the inconsistency of data-protection laws across the globe. An example of this is Australia, the “She’ll Be Right” nation: compared with the USA and the UK, Australians don’t benefit from data-protection laws that require disclosure of data compromises.

What is needed in a robust data-compromise-disclosure law or regulation is for data-security incidents to be disclosed properly and promptly to the law-enforcement authorities and the end users.

This should cover what data was affected, which end users were placed at risk by the security breach, when the incident took place and where it took place.
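
If such a disclosure requirement were expressed as a data structure, it might look something like the sketch below; the field names are my own illustration of the “what, who, when, where” elements, not wording from any actual legislation.

```python
# Hypothetical sketch of the "what, who, when, where" elements a
# data-compromise disclosure might carry. Field names are illustrative only,
# not taken from any actual law or regulation.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class BreachDisclosure:
    data_affected: List[str]        # what: e.g. usernames, passwords, email addresses
    users_at_risk: int              # who: how many end users were exposed
    occurred_at: datetime           # when: when the incident took place
    location: str                   # where: the system or site involved
    reported_to_regulator: bool     # whether the authorities were told
    reported_to_users: bool         # whether affected end users were told


# Purely illustrative values, not figures from any real incident.
example = BreachDisclosure(
    data_affected=["username", "password", "email address"],
    users_at_risk=10000,
    occurred_at=datetime(2011, 12, 9),
    location="customer-service search page exposed to the public Internet",
    reported_to_regulator=True,
    reported_to_users=True,
)
```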

International issues

We also raised the issue of what happens if the situation crosses national borders. Here, nations would have to set out practices for handling these incidents.

It may be an issue that has to evolve in a similar way to how other areas of international law, like extradition, international child custody and access, and money laundering, have evolved.

Use of industry standards

Customers place trust in brands associated with products and services. The example we talked about in relation to the Sony data breach was that the Sony name has been well-respected for audio-visual electronics since the 1960s, and that the PlayStation name was a respected brand associated with a highly-innovative electronic gaming experience. But these names were compromised in the recent security incidents.

There is demand for standards that prove a computing service provider’s ability to deliver a stable, properly secure computing service. Analogies we raised were the standards in place to assure the provision of safe goods, such as those covering vehicle parts like windscreens, or those affecting the fire-safety rating of the upholstered furniture and soft furnishings in the hotel we were in that afternoon.

Examples of these are nationally-recognised standards bodies like Standards Australia, the British Standards Institute and Underwriters Laboratories. As well, there are internationally-recognised standards bodies like the International Standards Organisation (ISO), and industry-driven standards groups like the DLNA.

The standards we focused on today were ISO 27001, which covers information-security management, and ISO 20000, which covers IT service management.

Regulation of standards

Here, government regulators need to “have teeth” when it comes to assuring proper compliance. This includes the ability to issue severe fines against companies that don’t handle data breaches responsibly, as well as mitigation of those fines for companies that suffered an incident but had audited compliance with the standards, demonstrated by evidence of compliant workflow through their procedures, especially during the data incident.

As well, Brahman underscored the need for regular auditing of “computing-as-a-service” providers so they can prove to customers and end users that they have procedures in place to deal with data incidents.

I would augment this with a customer-recognisable, distinct “Trusted Computing Service Provider” logo that can only be used if the company’s processes comply with the standards. The logo would be promoted with a customer-facing advertising campaign extolling the virtues of buying serviced computing from a compliant provider. This would be the “computing-as-a-service” equivalent of the classic “Good Housekeeping Seal” used for food and kitchen equipment in the USA.

Conclusion

What I have taken from this event is that the effort of maintaining a secure computing service is now moving away from the customer who uses the service and towards the provider who delivers it. As well, there is a requirement to establish and enforce industry-recognised standards concerning the provision of these services.