Confronting Challenges in Identity and Access Management

To protect consumer and institutional assets, the financial services industry is heavily regulated by state, federal, and international agencies. Significant requirements govern the protection of financial data.

Therefore, while identity and access management is of critical importance across all industries, it is particularly so in the financial services arena. Identity & Access Management (IAM) involves ensuring that the right individuals have access to the right data.

A robust identity and access management system controls individual access to specific data according to role and conflict avoidance requirements. Moreover, the best systems ensure that information access can be audited, monitored, logged, and reported as necessary. Just-in-time provisioning also plays a vital role, ensuring people gain, or lose, access as quickly as possible. Rapid provisioning improves corporate efficiency, while rapid deprovisioning minimizes security risk.

Yet even the best IAM systems cannot remedy every challenge BFSI entities face in today’s rapidly evolving technology environment. The following are some of the trickiest challenges in identity management, and how the most forward-thinking entities are tackling them.

Work from Anywhere

In the past, employees who wanted to access corporate data worked from one location: on premises. As a result, security systems followed a so-called perimeter strategy, which focused exclusively on testing and perfecting security inside the physical plant. Employees typically did their work from an assigned space within a specific location.

Today, users work from various locations: home, a board room, an airport, a hotel. And regardless of location, employees need and expect access to protected data to do their jobs. While the specifics of IAM might not change much with location independence, the risk of corporate identity theft is greater outside the traditional work environment, and the likelihood of unauthorized access increases as people log in from across the street or around the globe.

To mitigate this threat, IAM now requires usage analytics and review to identify and prevent unusual access patterns. Repeated failed logins, logins from unfamiliar devices, and repeated attempts to reach non-provisioned information after login all suggest that an identity may be compromised. Strong analytics and detection systems identify atypical logins sooner, allowing for speedier resolution.
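
As a rough illustration, the sketch below flags the patterns just described from a stream of login events. The event fields and thresholds are assumptions for the example, not any particular IAM product’s schema:

```python
# Illustrative sketch only: flag potentially compromised accounts from login events.
# Event fields and thresholds below are assumptions, not a specific IAM product's API.
from collections import defaultdict
from datetime import timedelta

FAILED_LOGIN_LIMIT = 5          # repeated failed logins within the window
WINDOW = timedelta(minutes=15)

def flag_suspicious(events):
    """events: iterable of dicts with 'user', 'timestamp' (datetime), 'device_id',
    'success' (bool), and 'authorized_resource' (bool, post-login)."""
    failures = defaultdict(list)      # user -> timestamps of recent failed logins
    known_devices = defaultdict(set)  # user -> devices seen so far
    alerts = []
    for e in sorted(events, key=lambda e: e["timestamp"]):
        user = e["user"]
        if not e["success"]:
            # Keep only failures inside the sliding window, then add this one.
            failures[user] = [t for t in failures[user] if e["timestamp"] - t <= WINDOW]
            failures[user].append(e["timestamp"])
            if len(failures[user]) >= FAILED_LOGIN_LIMIT:
                alerts.append((user, "repeated failed logins"))
        else:
            if e["device_id"] not in known_devices[user]:
                alerts.append((user, "login from unrecognized device"))
            known_devices[user].add(e["device_id"])
            if not e.get("authorized_resource", True):
                alerts.append((user, "attempt on non-provisioned data"))
    return alerts
```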

Bring Your Own Device

Related, but not identical, to the work-from-anywhere trend is the bring-your-own-device (BYOD) initiative. The greatest challenge with mobile devices, cell phones and tablets in particular, is the differing security protocols created by different manufacturers. Ensuring a uniform security protocol across all devices takes time, planning, and testing, which can stress an already busy IT department.

However, BYOD is here to stay. Employees expect 24/7, anywhere access to enterprise applications, such as email, which may contain protected information. As a result, even the inadvertent loss of a mobile device presents a real risk of confidential information falling into the wrong hands. The notification delays that frequently accompany lost devices further exacerbate that risk.

Strict IAM alone doesn’t adequately protect against mobile threats; another layer of access control needs to accompany traditional sign-on methods. Yet the more arduous the protocol, the less satisfied the user. Dissatisfaction often results in non-compliance, including saving or forwarding passwords in unsecured applications such as a notes app or personal email.

A common solution to this issue is so-called “smart authentication,” which recognizes login location through network identification. A rules- and exception-based protocol adjusts login requirements depending on the location of the access request: when a user is in an approved location, login protocols can be simplified; when users are in unlisted locations, additional security measures are applied.
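
A minimal sketch of that rules-based idea, assuming hypothetical network ranges and factor names rather than any specific identity provider’s API:

```python
# Minimal sketch of location-aware ("smart") authentication. Network ranges and
# factor names are hypothetical; a real deployment would integrate with an IdP/MFA service.
import ipaddress

APPROVED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),       # corporate LAN (example)
    ipaddress.ip_network("203.0.113.0/24"),   # branch office VPN egress (example)
]

def required_factors(source_ip: str) -> list[str]:
    """Return the authentication factors to demand for a login attempt."""
    ip = ipaddress.ip_address(source_ip)
    if any(ip in net for net in APPROVED_NETWORKS):
        return ["password"]                       # simplified protocol on approved networks
    return ["password", "otp", "device_check"]    # step-up for unlisted locations

print(required_factors("10.12.0.7"))      # ['password']
print(required_factors("198.51.100.20"))  # ['password', 'otp', 'device_check']
```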

Conclusion

The best identity and access management systems stay ahead of challenges presented by our rapidly evolving economy. They enable not only rapid provisioning and decommissioning, but also provide a layer of defense against common risks in the banking, financial services and insurance industries. By investing in robust IAM, BFSI enterprises can ensure they remain in compliance with regulatory requirements, protect consumer information and thwart cyber-theft. It is those organizations that will gain or retain a competitive advantage in the coming years.

Want to learn more about developing and implementing robust IAM solutions? Contact us today.

Big Data and Monitoring the Cloud

The cloud is becoming an increasingly popular way for companies across all industries to reduce costs related to their IT infrastructure. By embracing the cloud, companies are able to reduce technology infrastructure costs, innovate more rapidly and engage in more responsive client relationships.

However, operating in the cloud requires a different approach to monitoring and optimization.

Why Monitoring Must Change

In the past, applications often ran on dedicated hardware, physically separated from other IT resources. Different technologies, languages, and platforms were common, but generally of little consequence. Silos were frequent but didn’t impact monitoring, load, or performance, and each team developed the monitoring techniques and expertise best suited to its specific needs.

In today’s cloud environment, applications built on different platforms and technologies often share the same stack. This heterogeneous environment requires coordination between team owners that was previously unnecessary.

In a traditional data center, it was common and acceptable for development teams to generally ignore infrastructure. Dedicated server teams took responsibility for the physical environment, while application teams monitored app and end user performance. However, in a cloud scenario, development teams have end-to-end responsibility, which includes the backend infrastructure.

Why? Application changes – which happen more quickly in the cloud – may have unanticipated repercussions on application, environment or end user performance across the cloud. Degraded performance is no longer limited to one team. It can affect the entire organization.

As a result, there is now a need to facilitate communication between the employees responsible for host, network, application, and end-user monitoring. Everything from service-level data (errors, response times) to real-user monitoring (end-user response times and errors) to behavioral data and log monitoring needs to be freely available among previously unrelated teams. Without improved communication strategies, transparency and visibility will be lacking, and problems increasingly likely.

Potential solution – Big Data

In order to facilitate the collection and sharing of performance metrics at each particular layer (infrastructure, application, user), organizations must invest in both automation and technical expertise. It is unlikely that a single tool will suffice to bring necessary data points together.

However, by feeding data streams from different solutions into a common big data back end, greater transparency is available for processing and analysis.
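
As a hedged illustration of that pattern, the sketch below normalizes payloads from hypothetical monitoring tools into one shared schema before loading them into a common back end; the field names and the send() stand-in are assumptions:

```python
# Sketch: normalize events from separate monitoring tools into one common schema
# before loading them into a shared big data back end. Field names, source formats,
# and the send() target are assumptions for illustration.
import json
import time

def normalize(raw: dict, source: str, layer: str) -> dict:
    """Map a tool-specific payload onto the shared schema."""
    return {
        "source": source,                      # e.g. "infra_monitor", "apm", "rum"
        "layer": layer,                        # infrastructure | application | user
        "metric": raw.get("name") or raw.get("metric"),
        "value": raw.get("value"),
        "timestamp": raw.get("ts", time.time()),
    }

def send(record: dict) -> None:
    # Stand-in for a message-queue producer or bulk load into the analytics store.
    print(json.dumps(record))

send(normalize({"name": "cpu_util", "value": 0.91}, "infra_monitor", "infrastructure"))
send(normalize({"metric": "page_load_ms", "value": 1840, "ts": 1700000000}, "rum", "user"))
```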

By implementing a big data monitoring protocol, insight into the health of the cloud environment becomes more accurate, more available through shorter collection cycles, and more meaningful to a greater number of users in less time.

Moreover, because big data is uniquely suited to consolidating and making sense of data points from numerous sources in various formats, it enables the detection, and even the prediction, of performance issues.

Combining big data with cloud monitoring mitigates operational risk, reduces costs and helps maximize the potential of the cloud, ensuring return on investment and increased operational efficiency.

Want to learn more about cloud migrations, including big data projects for monitoring? Contact us today.

Big Data’s Increasing Role in Financial Services

Big data is a pretty popular buzzword in the tech space these days. And while it has well established implications for marketing, product development and client retention, financial services organizations are also looking at big data to support core business functions related to asset and trade management, risk management, regulatory compliance and information security.

In a world where data is growing by exabytes per year, financial services organizations that leverage big data find themselves at a significant competitive advantage. By aggregating and analyzing near real-time data from across departments, geographic locations and third party vendors, financial services entities are able to reduce operational risk, maximize profitability, ensure regulatory compliance and fight fraud and other information security risks.

1. Trading and operations

Big data enables the aggregation and analysis of real-time data from multiple internal and third-party sources prior to trade execution. While trade analysis historically involved some level of “gut” instinct, today more and more algorithms are being written to capitalize on data processing, often within microseconds.

Real-time enterprise modeling and analytics platforms frequently aggregate data extracts from operational, third-party, and non-traditional data sources, including news and social media. Business users and analysts are then able to explore the data and develop analytic business models. An enterprise-wide self-service analytics platform gives users controlled access to explore data when and how they need it. As a result, users obtain readily available, actionable insight, leading to higher margins and greater profits.

While real-time trade data is certainly of critical importance in this process, historical transactional data also plays a big role in trade operations. Institutions can leverage historical data and market movement information to feed trading and predictive models and forecasts. By analyzing past performance, better predictive analyses can be established for future business modeling.
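
For illustration only, here is a minimal sketch of feeding historical price data into a simple predictive model; the file name, features, and next-day-direction target are assumptions, not a production trading strategy:

```python
# Hedged sketch: fit a simple predictive model on historical market data.
# The input file, features, and target definition are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

prices = pd.read_csv("historical_prices.csv")            # assumed file with a 'close' column
prices["return_1d"] = prices["close"].pct_change()       # daily return
prices["ma_ratio"] = prices["close"] / prices["close"].rolling(20).mean()
prices["target_up"] = (prices["return_1d"].shift(-1) > 0).astype(int)  # next-day direction
data = prices.dropna()

X = data[["return_1d", "ma_ratio"]]
y = data["target_up"]
model = LogisticRegression().fit(X[:-250], y[:-250])     # train on older history
print("holdout accuracy:", model.score(X[-250:], y[-250:]))  # evaluate on recent data
```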

Through big data optimization, the front office gains greater insight into trade risk and exposure, counterparty reliability, and cash management. As a result, it can improve profitability, efficiency, and cost containment more effectively than ever before.

2. Regulatory Compliance

But big data isn’t only a front office initiative. Regulatory compliance is a critical issue in the financial services world, and it is changing rapidly. In an environment where 1) regulations change frequently, 2) adherence is complicated by detailed requirements, and 3) the penalties for violations are increasingly costly, this area of risk management is receiving greater and greater attention.

In the regulatory arena, fines for non-compliance can add up to millions of dollars, increasing the demand for tools that protect against violations. As a result, investments in big data solutions that proactively defend the institution are seen as important strategic decisions.

Institutions can use big data to measure regulatory compliance by combining regulatory data with supporting documents, contracts, attestations, and transactional information. By monitoring unstructured and seemingly unrelated content, including IM chats, emails, and telephone calls, and combining it algorithmically with trading activity data and documentation, compliance teams are forewarned of potential conflicts that put the trader and the organization at risk.
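
A simplified sketch of that idea, joining flagged communications to trading activity; the keyword lexicon, data shapes, and one-day window are illustrative assumptions (real surveillance systems use far richer NLP and entity resolution):

```python
# Illustrative sketch: combine communications surveillance with trading activity.
# The lexicon, column names, and join window are assumptions for the example.
import pandas as pd

RISK_TERMS = ("guarantee", "off the record", "before the announcement")  # example lexicon

def flag_conflicts(messages: pd.DataFrame, trades: pd.DataFrame) -> pd.DataFrame:
    """messages: columns [trader, sent_at, text]; trades: [trader, executed_at, symbol].
    Both timestamp columns are assumed to be datetimes."""
    hits = messages[messages["text"].str.lower().str.contains("|".join(RISK_TERMS))]
    merged = hits.merge(trades, on="trader")
    # Flag trades executed within a day of a risky message from the same trader.
    window = (merged["executed_at"] - merged["sent_at"]).abs() <= pd.Timedelta(days=1)
    return merged[window][["trader", "symbol", "sent_at", "executed_at", "text"]]
```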

3. Fraud Management

Financial services organizations are exposed to many kinds of fraud that can result in millions of dollars in losses. Predictive analytics tools are used to build models that detect and prevent fraud, correlating data across different sources to identify fraudulent behavior across seemingly unrelated data points. While credit card fraud detection based on location, prior shopping history, and spending patterns is familiar to many, the same capabilities can also be used in asset management scenarios.

These solutions operate in real time, using in-memory technologies to analyze hundreds of terabytes of data and score individual transactions for fraud as they occur.
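
As a rough sketch of such scoring, the rule-based function below assigns a fraud score to a single transaction against a customer profile; the thresholds and profile fields are hypothetical, and production systems combine trained models with many more signals:

```python
# Minimal sketch of rule-assisted fraud scoring on a live transaction.
# Thresholds and profile fields are hypothetical assumptions.
def fraud_score(txn: dict, profile: dict) -> float:
    """txn: {'amount', 'country', 'merchant_type'}; profile holds the customer's history."""
    score = 0.0
    if txn["amount"] > 5 * profile.get("avg_amount", txn["amount"]):
        score += 0.4                                   # unusually large purchase
    if txn["country"] != profile.get("home_country"):
        score += 0.3                                   # out-of-pattern location
    if txn["merchant_type"] not in profile.get("usual_merchants", set()):
        score += 0.2                                   # unfamiliar merchant category
    return min(score, 1.0)

profile = {"avg_amount": 80.0, "home_country": "US", "usual_merchants": {"grocery", "fuel"}}
print(fraud_score({"amount": 900.0, "country": "RO", "merchant_type": "jewelry"}, profile))  # 0.9
```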

By correlating seemingly unrelated incidents to identify fraud with greater speed and accuracy, financial institutions are able to reduce exposure and reduce liability for fraudulent losses.

Conclusion

Big data plays an increasingly important role across the front, middle, and back office of financial institutions. By taking advantage of previously unavailable insights, custodians are better able to serve their customers while improving business operations and profitability.

To learn how big data can help your BFSI organization, contact SRI Infotech today.

Picking a pilot process for RPA

When an organization is considering a new technology, it wants to identify the benefits before making a large investment. Running a pilot program is therefore extremely important: it allows the organization to test the solution on a small scale and decide whether the initiative suits its needs. Keep in mind, a pilot is different from a proof of concept (POC), but more on that another day.

Anyway, choosing the right pilot process is the most important part of a successful robotics operation. Many companies want to begin by automating their pain points; however, in many cases this is not the best place to start. For a pilot, you want to ensure the process you choose will demonstrate the value of RPA, not bury you in development or process re-engineering. In this article we outline seven key features of a good pilot candidate.

Start small

A good way to start an RPA project is to choose a smaller process to build up knowledge and required capabilities before scaling up. However, the pilot process cannot be too simple; remember that it should be relevant and bring benefits to your organization.

Keep it simple

Avoid selecting processes that are too complex or that have multiple paths; look for something simple, where the happy path covers at least 80% of cases. Complicated processes increase the risk of delays due to the different types of exceptions. Such processes usually involve several people, which may make it too difficult to map each step properly. A better choice is a process that a single subject matter expert can describe in detail.

Minimize the level of effort

Another important factor is ease of automation, although this may be difficult to determine, especially at the beginning of an RPA journey. An experienced RPA lead developer can perform an application assessment to estimate the level of effort required; they are familiar with the tool’s capabilities and can identify potential issues and present possible solutions. Processes that interact with virtual environments, such as Citrix, are typically not suitable for pilot projects because they require surface automation techniques. Instead, choose easier-to-automate applications, such as browser or Windows-based applications.

Use digital and structured data

Processes chosen for automation should have input data supplied in digital, structured form. If the data does not come directly from the target application or system, consider creating an Excel template to simplify ingestion of the data for the digital worker. Look for processes with structured data inputs, for example XLS, XML, CSV, or JSON, and avoid processes that rely on data from scanned documents!
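
A minimal sketch of validating such a template before work items reach the digital worker; the column names and the routing of incomplete rows are assumptions for the example:

```python
# Sketch: validate a structured Excel/CSV input template before handing work
# items to the digital worker. Column names and file format are illustrative assumptions.
import pandas as pd

REQUIRED_COLUMNS = ["case_id", "customer_name", "amount", "due_date"]

def load_work_queue(path: str) -> pd.DataFrame:
    df = pd.read_excel(path) if path.endswith(".xlsx") else pd.read_csv(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"Template missing required columns: {missing}")
    bad_rows = df[df[REQUIRED_COLUMNS].isna().any(axis=1)]
    if not bad_rows.empty:
        # Route incomplete rows to a human instead of failing the whole run.
        print(f"{len(bad_rows)} rows flagged as business exceptions")
    return df.dropna(subset=REQUIRED_COLUMNS)
```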

Ensure processes follow strict rules

Avoid picking processes with poorly defined or documented rules and procedures, or ones that require intuition. The RPA developer needs precisely described procedures, down to the click-and-type level, to train the digital worker. Finally, remember to document each step of the process accurately, provide the input data format, and list all types of business exceptions.

Maximize the number of available test cases

To maximize the chances of a successful RPA solution, test the process with enough test cases to cover as many production scenarios as possible. In the User Acceptance Testing (UAT) phase, ensure that test plans account for at least one full day’s worth of data. Running these tests properly will drastically limit surprises in production.

Identify the business value

As mentioned before, a pilot process should be small and simple, but keep its relevance in mind. Look for direct benefits to the organization, such as employee optimization, decreased throughput time, or a reduction in errors. Avoid selecting any critical process, as the risk of mistakes could be unacceptable. Target no-risk or low-risk processes, keep a human in the loop to make final decisions, and ensure the benefits are quantifiable with minimal effort.

Summary

There are no hard rules on how to choose a pilot project; however, following these tips should make life a bit easier and drastically increase the chances of success for an RPA initiative. A digital worker is similar to a new employee: just as you would train a new employee to perform a task, a digital worker requires similar training. Not to oversimplify things, but when looking at potential pilot projects, ask, “Is this task something we would allow a new employee to do in their first couple of weeks?” If not, chances are there is a reason that process might not be the best choice to kick off an RPA initiative! Additionally, consider working with a partner like us. We have experience evaluating potential processes, identifying their automation complexity, and estimating the level of effort required to implement them. Plus, after the assessment, we have resources available to assist you further, or even complete the project from start to finish 😊.

Technology Trends For The Manufacturing Industry

Like all industries, the manufacturing sector is in the midst of a technological revolution. From the cloud to automation to big data, recent advancements in information technology are helping manufacturing companies improve their plant productivity, their customer relations and their bottom line.

Yet technology covers a lot of ground. Some up-and-coming trends have real capacity to change the shop floor; others are less helpful. That said, over a third of manufacturers anticipate a boost in profits through improved use of technology over the coming year.

It is therefore critical for manufacturers to investigate technological advancements with the greatest potential upside, or risk being left behind.

Which technology trends have the greatest potential for manufacturing companies? Here are the four that will have the greatest impact in 2017.

Computer Aided Manufacturing

People often confuse automation with robotics. In the most advanced manufacturing settings, however, a combination of computer-assisted production and a skilled, autonomous workforce creates real-time improvements that impact the bottom line.

By utilizing preconfigured software applications that define batched manufacturing requirements, companies are able to improve reliability, reduce errors and execute projects more easily. Use of automated processes in manufacturing enables delivery of products to market within shorter timeframes, leading to greater business opportunities, reduced costs and heightened profitability.

Artificial Intelligence

By embedding smart sensors in machinery, artificial intelligence enables failure monitoring and prediction in real time. By identifying potential areas of failure before catastrophic collapse, companies can reduce downtime and maintenance costs. Maintenance can be scheduled on an as-needed basis, allowing for longer, more reliable run times. Moreover, by stopping production before a true mechanical failure occurs, repairs are often less costly and time consuming. Artificial intelligence helps ensure production schedules stay on time and that inevitable mechanical issues have less impact.
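
A toy sketch of the underlying idea: flag a machine for service when recent sensor readings drift beyond a learned baseline. The data, thresholds, and majority rule are illustrative assumptions:

```python
# Hedged sketch of sensor-based failure monitoring: flag a machine for maintenance
# when vibration readings drift beyond its learned baseline. All values are assumptions.
import statistics

def needs_maintenance(baseline: list[float], recent: list[float], sigmas: float = 3.0) -> bool:
    """baseline: historical readings under normal operation; recent: latest sensor window."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    drifted = [r for r in recent if abs(r - mean) > sigmas * stdev]
    # Schedule service if most of the recent window is out of tolerance.
    return len(drifted) > len(recent) // 2

normal = [0.42, 0.45, 0.43, 0.44, 0.41, 0.46, 0.43]
latest = [0.61, 0.63, 0.60, 0.44]
print(needs_maintenance(normal, latest))  # True: vibration has drifted upward
```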

Big Data

Even the simplest manufacturing environments involve complex interactions between production activities. As a result of this complexity, diagnosing and correcting process flaws can be complicated and time consuming. By utilizing big data, however, unanticipated correlations can be discovered and acted upon, leading to reduced costs and increased profits.

Big data allows upper management to aggregate and analyze previously isolated data and inputs to reveal important insights. By using big data, manufacturers can improve their production processes, boost sales and customer retention, and develop more targeted marketing efforts.

Cloud Computing

Manufacturers are always looking for new ways to streamline operations and improve productivity and profitability. Cloud computing plays an important role in meeting those goals.

By using cloud-based systems, manufacturers are consolidating sales and marketing efforts, developing, prototyping and launching new computer aided production paradigms, collecting and benefiting from analytics (See big data above), and revising product offerings more quickly than ever. The cloud provides an opportunity to grow computing capabilities without costly investments in infrastructure, physical or technical.

Conclusion

Technology is rapidly changing the manufacturing industry in numerous ways. As a result, making strategic IT investments is essential for continued growth. Aggregating different strategies in diverse operations – while difficult – allows for the greatest returns in an increasingly competitive global marketplace.

Contact SRI Infotech to help you determine which technological innovations will best serve your manufacturing business.

Related Content:

Case study: Manufacturing Efficiency Through Technology

Cloud Computing in BFSI

These days, many businesses are investigating the cloud. Until recently, however, banking, financial services, and insurance companies weren’t among them.

In the past, banks, asset management firms and other players in the BFSI ecosystem were reluctant to consider cloud environments, citing privacy concerns, security fears, regulatory requirements and the complexity of integrating internal and third party legacy systems.

Over the past few years, however, banks have been revisiting those conclusions.

Fighting Cybercrime With Big Data

Traditionally, cybercrime has been fought through infrastructure. Better firewalls. Better passwords. Better identity management. Cybercrime has historically been the province of technology departments. But is it enough?

The BFSI Cybercrime Environment

Not in today’s day and age. Every day we hear new stories about online financial fraud and other data breaches. According to the 2015 KPMG Cybercrime Survey Report, 72% of companies faced a cyberattack in the past year, and 63% of those suffered a financial loss as a result. For 70% of companies in banking and financial services, the direct costs of fraud run as high as 7 basis points; for an institution with $100 billion in assets, that represents a loss of $70 million a year.