Wednesday, September 29, 2010

Social networks posing security threat

Wednesday July 21, 2010

By SYAHRIR MAT ALI



IMMINENT DANGER: According to a Sophos survey in December 2009, 60% of the respondents believed that Facebook presents the biggest security risk compared to other social networking sites - way ahead of MySpace, Twitter and LinkedIn. - AP
THE Internet is a lot more than just a means of staying informed. It has evolved into something much more than what it was originally intended to be.

For some, it is an avenue to avoid the long queues at banks and service counters. For others, it is a place where you can work collaboratively.

But for most, the Web is a communication tool that connects them with family and friends via the many social networking tools.

Most Internet security experts conclude that cyberattacks on social networking sites will increase over the years. Since 2008, Facebook, Twitter, MySpace, LinkedIn and other such sites have been in the limelight as social networking's popularity has soared.

These services compete with each other to increase their user base by coming up with mobile tie-ups, applications and games.

All these efforts are worthwhile because social networking sites are the biggest thing on the Internet at the moment, and perhaps for many more years to come. Unfortunately, this trend has also been attracting all sorts of security threats.

New year, new threats

In its 2010 Threat Predictions report, McAfee Labs said it anticipates an increase in threats related to social networking sites such as Facebook.

It also said that criminal tool kits will evolve rapidly this year to capitalise on new technologies that increase the sophistication of attacks on unsuspecting users.

And, as a result, there is a good chance of an increase in rogue services that exploit Internet users' eagerness to download and install the many freely available Web 2.0 applications.

According to a Sophos survey in December 2009, 60% of the respondents believed that Facebook presents the biggest security risk compared to other social networking sites - way ahead of MySpace, Twitter and LinkedIn.

Cisco Systems' 2009 Annual Security Report mentioned that the Facebook user base has more than tripled, from 100 million users in 2008 to 350 million in 2009.

There is no doubt that such a huge increase in the number of users within a year's time is phenomenal, and it is attracting cybercriminals from all over the world to migrate their attacks to Facebook.

Mitigating threats

In order to stay safe while using social networking tools (or in fact, other Internet-based applications), users are urged to observe the following practices:

1. Never click on any URL links in unsolicited e-mail (i.e. e-mail that you were neither expecting nor asked for);

2. Never enter your online login credentials on pages opened via URL links you get from any e-mail. To be safe, type the URL into the browser yourself. If you use shared PCs, also avoid clicking links saved in the browser bookmarks;

3. Never jot down your online login credentials in an e-mail, even as a note to yourself. E-mail is not the proper place to store login credentials; keeping them out of your inbox minimises the risk should your e-mail system be compromised;

4. Always verify the validity of services or links you receive via e-mail, even if they appear to have been sent by a social networking service you subscribe to. Google them or, better yet, e-mail the service administrators and ask. Pay extra attention to the given URL, as a slight difference would mean a different site altogether;

5. Change your passwords from time to time and do not use the same password for every account. For a secure password, use a combination of uppercase and lowercase letters and numbers, and avoid words that exist in any dictionary; and

6. Do not arbitrarily download any updates for your applications. If you really need them, visit the official website and get more information.
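Point 4's warning that "a slight difference would mean a different site altogether" can be illustrated with a small sketch, not from the column itself: comparing a link's hostname against a list of known legitimate domains to flag lookalikes. The domain list and the 0.8 similarity threshold here are hypothetical examples.

```python
# Illustrative lookalike-domain check (hypothetical domain list and threshold).
from difflib import SequenceMatcher
from urllib.parse import urlparse

KNOWN_DOMAINS = {"facebook.com", "twitter.com", "linkedin.com"}  # example list

def suspicious(url: str, threshold: float = 0.8) -> bool:
    """Flag a URL whose hostname closely resembles, but does not exactly
    match, a known legitimate domain (e.g. 'faceb00k.com')."""
    host = urlparse(url).hostname or ""
    if host.startswith("www."):  # treat 'www.facebook.com' as 'facebook.com'
        host = host[4:]
    if host in KNOWN_DOMAINS:
        return False  # exact match with a legitimate domain
    for legit in KNOWN_DOMAINS:
        if SequenceMatcher(None, host, legit).ratio() >= threshold:
            return True  # near miss: likely a lookalike
    return False

print(suspicious("http://www.facebook.com/login"))  # False
print(suspicious("http://faceb00k.com/login"))      # True
```

A real mail filter would use far more signals (registration age, certificates, reputation feeds); the point is simply that near-identical hostnames are cheap to detect and cheap for attackers to register.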

Conclusion

It is imperative that Internet users understand that the threats and security issues which come with social networking tools aren't necessarily caused by vulnerabilities in the software or the user's PC … at least, not all the time.

Software vulnerabilities are reported from time to time, and they will always be a cornerstone of cybercriminal activity. But for most attacks to work, users have to set them in motion in one way or another.

(Syahrir Mat Ali is senior executive of the cybermedia research department at CyberSecurity Malaysia - the national cybersecurity specialist under the Ministry of Science, Technology and Innovation. These are his personal views expressed here.)


source: http://techcentral.my/columns/industryviews/story.aspx?file=/2010/7/21/it_col_industryviews/20100721153614&sec=IT_Columns_IndustryViews

Is the Internet breaking?

Wednesday July 14, 2010



REACHING THE LIMIT: The cost of building the Net is increasing substantially compared with the revenue obtained from Internet services -- and this will leave less profit to reinvest in innovation in the future. - AP

By WAN AHMAD KAMAL

THE Internet has become part of our lives faster than any technology in world history. So it's fair to ask - how's it doing? Could it be breaking?

Well, it's certainly creaking! From the first web browsers in the early 1990s to today's 1.7 billion users - the Internet has grown exponentially.

Internet traffic is now measured in petabytes per month. One petabyte can hold about 20 million four-door filing cabinets full of text - or 500 billion pages of printed text. Using a conservative methodology, growth in petabytes per month will see Internet traffic grow almost 30-fold by 2020.

While growth is good, driving us forward and making new things possible, the economics of the Internet are not keeping up with this demand.

In fact, the underlying economic model that built the Internet is breaking. Data from the service providers - those big, global telecommunications carriers that have been building the Internet since the early 1990s - shows that revenue from Internet services has been growing roughly in parallel with the investments required to keep building the Net, allowing service providers to put profits back into Internet extensions.

So far, so good. But the costs to continue building the Net will continue to increase by 18% to 20% every year from now till 2020, while revenue from Internet services will only increase 5% a year in that same timeframe.

Flatter revenue means less profit to reinvest in innovation. But without ongoing innovation the Internet stagnates - meaning access costs go up, service quality goes down, and growth dies.

We estimate this 'breaking point' may occur as soon as 2014.

Should enterprises care?

Enterprise networks need ongoing innovations in performance, reliability and security just as much as anyone, if not more so.

According to analyst firm Gartner, datacentre consolidation will continue with adoption of more server virtualisation technology and application services, and these will make the datacentre more congested than ever.

Enterprises will also focus on increasing employee productivity with technologies such as unified communication, videoconferencing, etc. And end-users are now demanding more connectivity.

At the same time, enterprise CIOs (chief information officers) are pressured to control operational costs, yet these ambitious goals mean a lot of high-performance networking - in the datacentre, in the campus, and in the branch.

Depending on where you sit in an enterprise, the "Connected Culture" - of device proliferation, content consumption, connected socialisation and machine-to-machine communications - brings a social-media-friendly culture where all things, people, devices, machines, institutions and human knowledge are at our fingertips.

This brings great promise and great peril because it means dealing with demands from an ever increasing variety of users, devices and applications, while increasing performance, security, and reliability.

When it comes to the Internet and our global networks, the industry "gets it." Industry players know very well that if we don't keep investing in innovation, we're out of business, which is why new approaches are needed for tomorrow's networks.

The typical network architecture found in most datacentres is actually two different networks: A storage area network and an Ethernet backbone.

In both cases there are multiple tiers of autonomous devices that are attempting to co-operate to create "a network." There are also various appliances sprinkled in - applications or security tools trying to solve specific problems.

So, in the end, it is a very complex and expensive environment. This matters, because datacentres drive enterprise networks on the ground, and increasingly, in the cloud.

This kind of complexity is the enemy of performance, security, reliability, and scalability.

Towards a simplified approach

The strategic approach we and the industry have to take, then, is to simplify things and collapse datacentre architectures to enable the exponential scaling in performance, security and reliability needed to meet future networking demands and leverage cloud solutions.

The answer is to virtualise and consolidate security appliances into a single pool of security services, then virtualise the access layer - the switches - eliminating the aggregation layer. Then we provide a consistent set of edge services to connect the datacentre to other datacentres, and to the Internet.

The goal is a simple flat datacentre fabric that connects all the elements of the datacentre, yet maintains the simplicity of a single switch, with a single control plane which can scale for cloud-ready applications.

You can then partition this fabric into logical segments, which allows you to virtualise your networks - this is essential to achieve exponential gains in performance and scalability - delivering what we call cloud-enabled security.

And because this simplified datacentre fabric operates as a single switch, and is standards-based, it's interoperable for multi-vendor environments.

This is where Juniper sees the datacentre going because there's no other way to scale to meet the network demands enterprises face and truly leverage the cloud.

(Wan Ahmad Kamal is the managing director of Juniper Networks Malaysia, an information technology and computer networking products company)


source: http://techcentral.my/columns/industryviews/story.aspx?file=/2010/7/14/it_col_industryviews/20100714102633&sec=IT_Columns_IndustryViews

Towards better global cybersecurity co-ordination

Thursday April 29, 2010


By Lt Col HUSIN JAZRI (Retired)

I HAVE been involved in national defence and cybersecurity for many years, and I must say that one of the most important international hurdles that we must overcome together - sooner rather than later - is the inconsistency between what I call the "geographic limitation of sovereign national laws" and its inherent conflict with the borderless nature of the Internet.

The Internet as a whole is too important for the global community, as well as our own national society's well-being and progress, for us not to seek improved ways and means to effectively protect participants - call it an Internet governance agenda, if you will.

The issue is this: The usefulness and effectiveness of any country's legislation is bound by its geographic borders (unless specific inter-country treaties are signed) whereas nearly all of our online activities - social media, peer-to-peer networking, instant messaging, streaming content, even straightforward websurfing, and blog or content hosting - are clearly not.

And as for the career cybercriminals who are behind the online incidents of fraud and forgery, system intrusion, international espionage and hate-motivated international cyberincidents (even spam) - many of them are savvy enough to engage in cross-border activities when they know that the so-called "long arm of the law" still isn't long enough to catch them.

Our current arrangement for conflict resolution in cyberspace, which is generally based on goodwill and a spirit of inter-agency co-operation between the various countries of the world, is still not sufficient and clearly is wanting in many respects.

This is because the way we operate now is at odds with the realities of information and communications technology: Our laws are formulated to conform within geographic borders, so we cannot expect to succeed by forcing that square peg (border-limited preventive measures) into a round hole (a borderless world).

There are many cyberincidents that remain pending because the cross-border nature of the Internet does not lend itself to legislation, particularly those that involve multiple sovereign nations (cybercrime frequently involves not one but multiple transiting countries).

No phishing

Identity theft incidents (like phishing sites targeting local banks) are good examples. In many instances, criminals targeting customers of local financial institutions will host their phishing sites abroad. Indeed, 99% of phishing sites targeting Malaysian financial institutions are hosted outside the country.

This essentially leads to two things in their "favour." The first one is delay of the takedown or removal of the phishing sites due to different time zones and other physical-geographic reasons (language too, sometimes).

Secondly, efforts to obtain information such as log files or contents on the server for investigation purposes are also delayed due to differences in legal provisions. The cybercriminals know this and they have been exploiting the loophole for a long time.

Aside from the geographic limitation of laws, another equally important issue is the difference in cultural interpretation of security versus privacy.

In the West for example, personal privacy is revered, thus any manner of disclosure without the express permission of the information owner is frowned upon. Even the idea of national identification that would make it easier to track citizens is treated with suspicion. However, after 9/11, the balance between security and privacy has probably permanently shifted.

To move forward, I believe there is a pressing need for more formalised international diplomacy channels for cyberspace conflict resolution. Many informed experts have advocated a heightened level of international diplomacy as a way to get each of us to understand our differing points of view; and I am in full agreement.

Personally, I hope to see arbitration at the highest global stage, perhaps at the United Nations level. I envision a UN-style resolution mechanism instituted among member nations - something that would not require anything beyond the national agencies already physically in place in many countries.

The arbitration committee could take up all the cyberspace issues and conflicts escalated by member national agencies and speed up their resolution.

I do not say this lightly, because we currently have no mechanism of redress for even a simple issue - say, a woman who wants her photo removed after it showed up on the Web without her permission.

While we all welcome freedom of expression and information on the Internet, I am sure we agree that freedom without responsibility is wrong.

Lt Col Husin Jazri is chief executive officer of CyberSecurity Malaysia, the national cybersecurity specialist under the Ministry of Science, Technology and Innovation.


source: http://techcentral.my/columns/industryviews/story.aspx?file=/2010/4/29/it_col_industryviews/20100429165243&sec=IT_Columns_IndustryViews

Green manufacturing: Past, present and future

Thursday April 29, 2010


THE Copenhagen Environment and Climate Summit of December 2009 shed new light on our planet, the global warming issue and the responsibility of every nation, company and individual towards reducing emissions.

Over the next decade, the United Nations and member countries are committing billions of dollars to work through many of the carbon emission challenges that burden our environment. Today, and closer to home, the green and sustainability trend appears to be infiltrating every aspect of manufacturing, consumer behaviour and new product designs as the attention turns towards this rising global issue.

In the manufacturing industry, analysts seem to be the most excited about the green and sustainability movement, followed by software vendors and manufacturers.

This seems somewhat unbalanced because manufacturers have the potential to reap the largest benefits. As good citizens, manufacturers should be leading the way rather than being guided by analysts or software vendors.

This doesn't necessarily mean that manufacturers are not aware of the benefits or aren't thinking about them. In fact, the number of manufacturers looking for software solutions related to green and sustainability has risen significantly in the last few months.

Unfortunately the majority of the inquiries are still externally driven and most of them relate to anticipated mandatory reporting requirements rather than reduction. There has not been a realisation that some of the trends emerging would warrant a fundamental shift in the way value chains are designed and managed today.

All the data points indicate that energy efficiency should be an obvious "must-have" for any manufacturing company pursuing sustainability initiatives.

To use an overseas example: in a 2004 study, the US Department of Energy pointed out that the manufacturing industry can achieve practical energy reductions of about 20%, worth almost US$19bil (RM65bil).

Coupled with the fact that the manufacturing sector consumes almost as much energy as the transportation sector, it is surprising that every US-based manufacturing company is not proactively embracing energy efficiency and sustainability initiatives.

To add to these data points, there is a compelling case for cost reduction:
• During the automation wave of the 1980s, the focus was on direct labour cost. Companies automated operations until they extracted the last scintilla of manual labour from the manufacturing and materials movement process;
• During the ERP (enterprise resource planning) and supply chain-planning wave of the 1990s, the focus shifted to material cost. Manufacturers realised that labour constituted only about 5%-15% of total cost in many industries, so it made more sense to focus on raw material cost, the largest proportion of end-product cost; and
• Extending this logic further, energy cost should be the next frontier of cost improvement as firms move to more sustainable platforms and the carbon content of products gains prominence.

So what happens?

Where does the line of reasoning break down? First, based on current accounting practices, energy cost most likely constitutes only about 5% of the product cost.

Second, there also exist numerous secondary cost factors involved in the "total energy" costs of a product - like the delivery and fuel costs to have raw materials, sub-components and subcontracted supplies delivered for final assembly.

These are all directly linked to the outsourcing trend and the relative "low" financial cost of transportation.

Third, consider also the petrol consumed for the workers to get to the plant as well as the rubbish, waste, water and sewage of massive factory facilities.

As a result, "all-inclusive" energy costs can be in excess of 10% with all the related overhead, hidden-secondary cost factors included.
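The "all-inclusive" tally above can be illustrated with a toy breakdown. Every component figure below is a hypothetical round number except the ~5% direct-energy share the column cites; the sketch only shows how hidden factors push the total past 10%.

```python
# Toy illustration of the "all-inclusive" energy-share argument.
# Only the ~5% direct figure comes from the column; the rest are
# hypothetical round numbers for the hidden-secondary factors.
cost_share = {
    "direct process energy": 5.0,   # cited in the column (~5% of product cost)
    "inbound freight fuel":  3.0,   # hypothetical
    "worker commuting fuel": 1.5,   # hypothetical
    "facility waste/water":  1.0,   # hypothetical
}
total = sum(cost_share.values())
print(f"All-inclusive energy share: {total:.1f}% of product cost")  # 10.5%
```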

Cost and margin pressures will continue in all sectors of the manufacturing economy. A real concern for green initiatives is that most manufacturers do not know the detail of current energy usage and the potential impact of those rising energy costs on their businesses.

This is because energy is considered and accounted for as an overhead cost. Product value chains are not typically viewed from an energy or carbon perspective today.

Currently, there are only scattered attempts to allocate energy cost accurately from department, cost centre or product perspectives. Energy directors and green executives still have a corporate perspective towards green and sustainability that is based on a facilities orientation.

Manufacturers that are serious about capitalising on green and sustainability as a competitive advantage - instead of just a reporting obligation - need to change quickly. There is an opportunity to move to the "next phase" that embraces the change instead of barely surviving it.

This next phase is critical and will decide the winners and losers in the new global carbon economy. Some of the steps needed in this phase are radical and will require strategic planning that needs to start now.

The same tools

Energy efficiency is the base-level requirement to qualify for the next phase. To achieve measurable savings in energy consumption, manufacturers must examine internal processes for efficiency gains.

Most energy consumption takes place inside manufacturing processes, where energy is consumed by heavy equipment and machinery.

That means, a company may change all the light bulbs in its buildings and paint the office buildings green, but that would only impact 18% of its energy footprint.

The good news is that the tools and technology required for equipment energy monitoring are the same ones that have been available in recent years for general operational efficiency monitoring and intelligence.

This provides further impetus to invest in an operational intelligence platform that assembles and contextualises live data from equipment and assets on the plant floor.

Once internal facilities and processes have been modelled and optimised, the next step is to migrate this energy perspective to the entire value chain.

Contextualising energy usage and identifying products that account for an enterprise's GHG (greenhouse gas) emissions is not enough. It is also important to account for suppliers' processes and their carbon footprint.

For example, if a manufacturer sells products through retailers, or is part of a supply chain that does, it is likely to be hit sooner than others. Among the reasons given for going green is that large retailers are likely to mandate that manufacturers label products with their carbon footprint. This trend will likely spread to others.

After the value chain has been re-modelled to account for energy consumption and carbon footprint, the next step is to broaden the initiative to other aspects of sustainability.

Energy, and the natural resources that contain carbon, are just the beginning: true sustainability requires manufacturers to optimise their natural resources footprint in general, including water, landfill and so on.

Green and sustainability present both a threat and opportunity to industries in general and manufacturers in particular.

Energy efficiency is a necessary step for manufacturers, but alone it is not sufficient. Instead, to be successful and emerge as winners in the carbon economy, companies need to take a value chain model-based approach and continuously broaden and deepen that model to drive global and local optimisations in order to maintain and increase competitive advantage.

Further, companies need to broaden their approach to green and sustainability and account for all aspects of sustainability, not just energy consumption and carbon emissions.

K. Raman is regional managing director (Asean) of Oracle Corporation's Asia Pacific division. The multinational computer technology corporation specialises in developing and marketing enterprise software products.

Points to ponder when building a datacentre

Tuesday September 7, 2010



SIMPLE SOLUTION: A Sun Microsystems datacentre in Santa Clara. Rather than building datacentres all over the region, local businesses should opt for remote access solutions such as VPN to enable employees, partners and customers in other countries to access the datacentre in Malaysia. - AP

By K. RAMAN

RECENTLY, one of our customers was faced with a dilemma. His is a growing business and while Malaysia remains the headquarters with key corporate functions, it is his operations in Indonesia, Singapore and Vietnam that are growing. Should he build datacentres in each market? Or should he look at outsourcing his datacentre needs in each market?

It is not an uncommon dilemma for Malaysia-based regional businesses. Scalability, cost-efficiency and manageability are their top three considerations. I would add flexibility and speed of implementation to the decision-making process as well.

On the lowest end of the cost spectrum, businesses will just scale up their datacentres in Malaysia and rent fat pipes to connect to their operations outside of the country.

At the other end of the cost spectrum, there will be those who will build a datacentre in each city to support each market. This may not make economic sense. Besides the costs of building and deploying datacentres, they account for a sizable portion of real-estate footprint and energy bills.

Both options, in the long term, are not cost-effective and will not scale well to support the growing business needs.

A simple solution is to use remote access solutions such as VPN (virtual private networks) that enable your employees, partners and customers in other countries to access the datacentre in Malaysia without introducing additional infrastructure.

Furthermore, there is an emerging trend towards building a datacentre that offers on-demand access to a shared pool of configurable computing resources to support growing business needs from different regions quickly and when required.

This model is a possible long-term strategy, if properly managed, as it can offer high computing resources at lower costs, help IT attain efficiencies, and maintain flexibility and agility.

The first steps

To start, you will need to determine the requirements that cover the entire computing spectrum: Infrastructure, systems, applications and services.

With your needs in mind, begin to define the architecture blueprint needed to meet these requirements. You will then need to meet with various vendors and solutions providers to evaluate technologies and products for building these hardware and software stacks in the architecture blueprint.

We always advise our customers to implement reference architecture to validate key design areas in the architecture blueprint. This reference architecture will also help to examine how well your blueprint meets the requirements. And where there are gaps, refine the blueprint, if applicable.

Once you have determined the right blueprint, the next step is to adopt a phased implementation approach adhering to the architecture, using the right tools and technologies.

In undertaking this entire process, there are three key considerations to bear in mind:

1. Design considerations: You should develop a design that is efficient, flexible, agile, scalable and manageable;

2. Business alignment: Both IT and the lines of business will need to discuss and concur on how the datacentre can meet and support current and future business goals; and

3. Execution strategy: Have the right people and processes to build a datacentre that meets both the technical and business considerations while using the most appropriate technologies.

The execution process is critical in building a datacentre. You will need to work through the associated risks involved and clearly identify the requirements and resources needed to implement each architecture component.

Engaging the right resources, whether in-house or with an outsourced party, to work on each component is important.

In-house vs. outsourced

Should you staff up your in-house IT resources or outsource your IT needs? The choice will depend on a few factors such as your company's requirements as well as the in-house skill set and the cost for outsourcing.

Based on experience and observations, I would argue that a combination of in-house and outsourced resources will yield the desired results as this combination gives the "best-of-breed" skills and experiences needed for successful implementation.

The onus is on in-house resources to be involved in key project stages and to have a strong understanding of the design and implementation plan to drive the project forward.

The in-house resources will ensure that the datacentre is built in accordance with the architecture blueprint and will leverage on outsourced resources to carry out tasks in an efficient and effective manner.

Outsourced resources are services experts who can transform your selected products and technology into meaningful business results, minimise the risks associated with complex IT projects by utilising their technical and project management expertise, as well as help you to save time and money by implementing technology using global best practices.

Importantly, for the long run, outsourced services experts can help you optimise your datacentre.

K. Raman is regional managing director (Asean) of Oracle Corporation's Asia Pacific division. The multinational computer technology corporation specialises in developing and marketing enterprise software products.


source: http://techcentral.my/columns/industryviews/story.aspx?file=/2010/9/7/it_col_industryviews/20100907162405&sec=IT_Columns_IndustryViews

Data protection vital for SMBs

Tuesday September 7, 2010



BEING VIGILANT: According to a survey, two-thirds of SMBs' IT staff time is spent working on information protection, including computer security, backup, recovery and archiving as well as disaster preparedness. - AP

By ERIC HOH

EVERY day, around the world, people from all walks of life use information to make important decisions on a variety of matters. This is true across the broad section of human activity, more so in areas related to economic and commercial endeavour.

The dependence of economies on information is obvious, with modern-day business and commercial activities relying on using information and the ever-widening communications networks.

These activities include critical areas that provide fundamental services to the public, such as banking, utilities, and transportation.

The smooth running of these sectors has a heavy bearing on public welfare and national security, hence the importance of undisrupted information delivery and availability cannot be overstated.

Information must not only remain safe but must be accessible to the right people at the right time. Budgets must be met, whilst productivity levels are maintained or improved.

What's more, compliance with regulatory requirements for the proper handling of sensitive information must be achieved and verified, and the existing resources should be optimised.

This applies to all businesses, small and medium businesses (SMBs) included. Any loss of information critical to operations can have an impact on businesses, regardless of its size. As such, putting in place systems to ensure that information is always protected and available should be a priority for SMBs.

Challenges for SMBs

It is an uncomfortable thought when you consider that an SMB's most valuable information could become its greatest loss in a matter of seconds. An SMB's reputation, relationships and time are critical to success, and downtime or loss of information can cause irreparable damage.

The statistics bear this out - Symantec's 2010 Global SMB Information Protection Survey found that small and medium businesses are now making protecting their information their highest IT priority, as opposed to 15 months ago when a high percentage had failed to enact even the most basic safeguards.

This shift makes sense as SMBs are facing increased threats from cyberattacks, lost devices and loss of confidential or proprietary information. The average SMB globally now spends approximately US$51,000 (RM163,000) a year to protect its information.

Malaysian SMBs surveyed rank data loss (78%) and cyberattacks (55%) as their top business risks, and they showed a heightened interest and increased investment in information protection.

Two thirds of IT staff time is now spent on information protection, including computer security, backup, recovery and archiving, as well as disaster preparedness.

Cost has often been cited as a big barrier to adequate protection by SMBs. Growing storage costs are commonly blamed on the avalanche of information that needs to be secured, managed and retained.

However, backup need not be costly and in a large number of cases, SMBs are not using the storage resources they have efficiently, which leads to unnecessary spiralling costs.

To protect their information effectively and quickly, SMBs need a simple but powerful backup system that can automate backups at a reasonable cost. With a simplified data protection approach, SMBs can reduce complexity, save time, and increase reliability and availability all at once.

Technologies such as disk-based backup, snapshot backups, data deduplication, continuous data protection, and cloud-based backup options can help SMBs address data protection concerns.
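Data deduplication, one of the technologies mentioned above, cuts storage costs by keeping only one copy of identical pieces of data. A minimal sketch of the idea (the function and chunk size here are illustrative, not any vendor's implementation): each file is split into fixed-size chunks, each chunk is identified by its hash, and duplicate chunks are stored only once.

```python
import hashlib

def deduplicated_store(files, chunk_size=4096):
    """Store files as unique chunks keyed by their SHA-256 digest."""
    store = {}      # digest -> chunk bytes, each unique chunk kept once
    manifests = {}  # filename -> ordered list of chunk digests
    for name, data in files.items():
        digests = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # duplicates are not stored again
            digests.append(digest)
        manifests[name] = digests
    return store, manifests

def restore(name, store, manifests):
    """Rebuild a file from its manifest of chunk digests."""
    return b"".join(store[d] for d in manifests[name])
```

Two identical 8KB files would consume only one 4KB chunk of actual storage in this scheme, which is why deduplication pays off quickly for backup sets full of near-identical copies.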

For SMBs, money and staff time are at a premium, and there will always be something more pressing to do than manage backups. However, as the volume of information increases, so does the risk to a company's bottom line if that information is not protected.

Smart investments

Ensuring that information is well managed and easily recoverable is essential for keeping business productive and profitable. Without proper protection, it could take days, if not weeks, and require significant expenditure to recover important business information from an infrastructure failure, natural disaster, or simple human error.

There are many solutions available in the market designed to address backup and recovery challenges that constantly put SMBs under increasing pressure, especially when staff and resources are limited.

SMBs should look for solutions that offer simplified management and the ability to scale as their business grows, and that make the most of their current investments so that backup costs do not surge.

SMB environments may not need the scale provided by enterprise backup and recovery solutions, but they do need much of the functionality. That means policy-based backups, automated operations, and centralised management should be key design features, helping lightly staffed SMBs effectively manage system and data protection operations.
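The essence of policy-based backup is that an administrator declares an interval and a retention window per dataset, and the software decides what to run and what to expire. A rough sketch of that logic (the policy table and dataset names are hypothetical examples):

```python
from datetime import datetime, timedelta

# Hypothetical policy table: how often to back up, how long to retain.
POLICIES = {
    "finance": {"interval": timedelta(hours=4),  "retain": timedelta(days=90)},
    "mailbox": {"interval": timedelta(hours=24), "retain": timedelta(days=30)},
}

def backup_due(dataset, last_backup, now):
    """True once the dataset's backup interval has elapsed."""
    return now - last_backup >= POLICIES[dataset]["interval"]

def expired(dataset, backup_time, now):
    """True once a stored backup is past its retention window."""
    return now - backup_time > POLICIES[dataset]["retain"]
```

A scheduler loop would call `backup_due` for each dataset and prune anything for which `expired` is true, so no one has to remember to run or clean up backups by hand.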

Ultimately, it is critical for SMBs to prioritise their investment on solutions that provide enterprise-class protection with minimal administration, so that they can focus their resources, time and budget on the success of their business.

Eric Hoh is vice-president for the Asia South region at Symantec Corporation, which helps organisations secure and manage their information.


source : http://techcentral.my/columns/industryviews/story.aspx?file=/2010/9/7/it_col_industryviews/20100907162619&sec=IT_Columns_IndustryViews

Building a cloud-ready datacentre network

Wednesday September 22, 2010



WAY TO GO: Previously, IT hardware and software were acquired and physically provisioned on site. However, cloud computing represents a new way to deliver and consume services on a shared network and IT infrastructure. - AP

By WAN AHMAD KAMAL

IT CAN be a daunting task to interconnect a growing number of virtual and physical devices while trying to simplify the network to manage these resources at scale. This article looks at three critical areas that companies should focus on.

Cloud computing represents a new way to deliver and consume services on a shared network and IT infrastructure. Previously, IT hardware and software were acquired and physically provisioned on site.

With cloud computing, the value of these same software and hardware products is delivered on demand, in the form of services over the network.

Cloud computing is not only relevant to network service providers or Internet-based service providers offering cloud services to customers. Enterprise or public sector IT organisations are becoming acutely aware of cloud computing's relevance to their own internal operations.

It is now possible for IT to build out private clouds or augment their resources with public clouds that enable their datacentres to benefit from this powerful computing model.

The lessons learned from cloud computing can vastly improve the scale, agility, and application service levels of enterprise datacentres as well as reduce costs. Achieving these results requires close examination of the network itself, which is the foundation of the cloud-ready datacentre.

Management complexity increases exponentially as more devices are added. This often necessitates physical segmentation, which runs counter to building large, shared resource pools that maximise economies of scale.

Overcoming these obstacles requires a fundamental shift in the way enterprise IT organisations build out their legacy datacentre networks. Success in building a scalable, cloud-ready datacentre network requires following three critical steps: (1) Simplify, (2) Share and (3) Secure.

Simplify

Simplification starts with reducing the number of autonomous devices. In the future, a single logical switch will be able to scale securely and reliably across the datacentre to connect all servers, storage and appliances.

Until that happens, interim measures can be taken to consolidate network layers, increase scale and performance without adding complexity and reduce costs:

• Leverage device density to reduce the number of physical devices;

• Employ technologies that enable multiple physical devices to act as one logical device;

• Reduce the layers of switching to two or fewer;

• Ensure reliable routing connections into and out of the datacentre; and

• Maintain a common operating system and a single point to monitor and manage the network with open APIs.

Share

With a simpler, scalable network to support large resource pools, the next step enables the dynamic sharing of resources for greater agility. This necessitates virtualisation at two levels - the virtualisation of servers, storage and appliances, and the virtualisation of the network itself.

Virtualisation minimises the need for physical segmentation, and allows capacity and bandwidth to be shared efficiently and flexibly for multi-tenancy and a high quality of service. VLANs, zones, MPLS and VPLS offer effective ways to virtualise the network within and between enterprise datacentres.

Secure

Another challenge involves maintaining trusted environments and scaling security for pooled resources. To complement the simplification and sharing of the cloud-ready datacentre, the security services should also be consolidated and virtualised. It is vital to secure data and services at rest and in transit using these and other security measures:

• Secure flows into the datacentre. Authenticate and encrypt connections to network endpoints (SSL) and enterprise devices (IPSec) while reducing device proliferation. It is also essential to prevent denial-of-service attacks and deploy firewalls to guard the edge and perimeter;

• Secure flows within the datacentre. Segment the network with VLANs, zones, virtual routers and VPNs, and use firewalls to protect application-to-application traffic - between servers, between virtual machines and between pods. Also employ application aware and identity-based security policies; and

• Set network-wide policies from a central location to ensure security compliance. Centralised reporting engines provide historical and real-time visibility into applications and data, and enable IT to perform scheduled vulnerability assessments.
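The first bullet above, authenticating and encrypting connections to network endpoints with SSL/TLS, can be illustrated with Python's standard ssl module. This is a generic client-side sketch, not any vendor's product: the default context requires a valid server certificate from the system trust store and checks that it matches the hostname, which together authenticate the endpoint before any data flows over the encrypted channel.

```python
import socket
import ssl

def make_client_context():
    """TLS context that authenticates the server before sending data."""
    # create_default_context() turns on certificate verification and
    # hostname checking, the baseline for a trusted endpoint.
    return ssl.create_default_context()

def open_secure_channel(host, port=443, context=None):
    """Connect to host:port and wrap the socket so traffic in transit
    is encrypted; raises ssl.SSLError if authentication fails."""
    ctx = context or make_client_context()
    sock = socket.create_connection((host, port), timeout=10)
    return ctx.wrap_socket(sock, server_hostname=host)
```

The same pattern applies inside the datacentre: server-to-server flows can use IPSec or mutually authenticated TLS, so that segmentation by VLANs and firewalls is backed by cryptographic identity rather than topology alone.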

Conclusion

By rethinking traditional legacy approaches and preparing for the advent of cloud computing, it is possible for IT organisations to build datacentre networks that offer greater economies of scale, improved application service levels, simpler management and lower costs.

Simplifying, sharing and securing the network are critical to successfully building out cloud-ready datacentres.

As Moore's Law ensures that technological advances continue to make cloud-ready datacentre networks a reality, IT organisations can take decisive steps today that drive businesses closer to the promise of tomorrow.

(Wan Ahmad Kamal is the managing director of Juniper Networks Malaysia, an information technology and computer networking products company)


source : http://techcentral.my/columns/industryviews/story.aspx?file=/2010/9/22/it_col_industryviews/20100922132735&sec=IT_Columns_IndustryViews