Technology & Securities Accounting Risk Management

Financial institutions seek to maximize profits and minimize risk. For institutional entities in banking and finance, a strong securities accounting system plays a critical role in ensuring that all parties to a transaction are properly informed, collateralized and protected from costly errors that delay trade completion, increase the risk of penalties and otherwise inhibit the smooth transaction processing necessary for success in today’s fast-paced business environment.

There are several non-market risk factors frequently encountered by companies engaged in institutional securities management:

Risk 1: Trade Entry Errors

Frequently, disagreements between trade parties result from errors as simple as mistyped terms or an extra zero. The risk of erroneous data entry is particularly high when trade terms change several times over a short period.

Yet, simple as these errors are, they often go undiscovered until the time of settlement. When payment expectations – either out or in – are not in alignment, discovering the root cause of the discrepancy, which could date back several weeks, wastes precious time and labor.

Risk 2: Disparate Valuation & Reconciliation Expectations

Valuation is of critical importance in maintaining the integrity of securities trades. Yet manual entry of trade value calculations can leave parties with different expectations of returns or outflows over a given period of time.

Discrepancies can result from calculation errors caused by mistakenly entered formulae or by the use of inconsistent pricing data.

Risk 3: Counterparty Risk Exposure

How stable are your transaction partners? How likely are they to complete their end of the transaction? Are their securities maintaining their valuation? Are the assets they pledged as collateral sufficient to mitigate risk, or is further collateralization required? The stability and reliability of trade partners play a big role in deciding whether to see trades through to completion, or whether market conditions require compromising profit to minimize risk.

Risk 4: Collateralization

Similar to counterparty risk exposure, trade collateralization reassures trade partners that agreements are likely to complete and serves as protection against counterparty default. Yet collateral is only as good as its availability. If the same collateral is pledged against multiple trades, its risk mitigation value is severely decreased or eliminated altogether.

Failure to adequately collateralize trades weakens not only relations between trade partners, but also the market environment as a whole.

Risk 5: Reporting compliance

While not as high profile as the other risks, failure to report trade valuations increases operational costs. Valuation reporting is required by exchanges and governmental regulations, and inability to submit it by mandated deadlines leads to fines, penalties and a poor reputation. Each of the other four risks, if unaccounted for, increases the likelihood of missed reporting deadlines. Failure to report accurately and on time leads to expenses that can undercut profitability in a highly competitive environment.

Technology For Risk Mitigation in BFSI

Technology-based solutions provide the opportunity to mitigate each of the five risks mentioned above, as well as many others. By combining trade documentation, trade details, 3rd party valuation data, standardized pricing calculations and asset tracking with internal and external reporting capabilities, customer account access and audit management, a technology-based solution can mitigate both non-trade and trade-based risk.

Mitigating Valuation Risk

For example, by integrating 3rd party valuation services and publishing daily position calculation reports, both party and counterparty are notified of anticipated payments over various timeframes, as of a particular date and timestamp. This provides both parties not only with common data from which to make educated decisions regarding trade status and future action, but also with the opportunity to recognize disparities. By fixing valuation disparities closer in time to trade origination, organizations prevent payment processing delays and ensure that errors don’t carry forward for significant periods of time.
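
As a rough illustration of the disparity check described above, the sketch below compares the two parties’ expected payments trade by trade and flags any difference beyond a small tolerance. All types, field names and the tolerance value are illustrative assumptions, not a description of any particular system.

```typescript
// Sketch: flag valuation disparities between party and counterparty
// using a shared third-party price feed. All names are illustrative.

interface PositionValuation {
  tradeId: string;
  asOf: string;            // ISO timestamp of the valuation snapshot
  expectedPayment: number; // positive = inflow, negative = outflow
}

interface Disparity {
  tradeId: string;
  ourValue: number;
  theirValue: number;
}

// Differences below this threshold are treated as rounding noise.
const TOLERANCE = 0.01;

function findDisparities(
  ours: PositionValuation[],
  theirs: PositionValuation[],
): Disparity[] {
  const byTrade = new Map(theirs.map(v => [v.tradeId, v] as const));
  const disparities: Disparity[] = [];
  for (const v of ours) {
    const match = byTrade.get(v.tradeId);
    if (!match) continue; // unmatched trades go to a separate exception queue
    if (Math.abs(v.expectedPayment - match.expectedPayment) > TOLERANCE) {
      disparities.push({
        tradeId: v.tradeId,
        ourValue: v.expectedPayment,
        theirValue: match.expectedPayment,
      });
    }
  }
  return disparities;
}
```

Run daily against both parties’ position reports, a check like this surfaces disagreements while the originating trade is still fresh, rather than at settlement.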

Moreover, by allowing for the storage of trade documents within the transaction management system, origination documentation can be accessed and evaluated by both parties in near real time, preventing delays related to search efforts caused by inefficient, unshared data management protocols. Unlike document retention facilitated by email and fax, in this system, both parties are assured of looking at the most recent document available, with the same terms and conditions.

Additionally, discovering the source of errors through auditing capabilities (be they clerical, technical or structural) allows for the prevention of future errors, enhancing productivity and profitability.

This combination of technology fixes to the traditional trade management environment has the potential to reduce discrepancies and resulting delays from 40% to 3–4%, while also increasing the speed of valuation resolution.

Mitigating Counterparty Risk

Technology can similarly level the playing field, ensuring comparable counterparty risk exposure for both parties. Real-time collateral validation ensures that particular assets haven’t been pledged multiple times for different transactions, and counterparty credit information helps determine whether a true meeting of the minds is possible between parties.
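
A minimal sketch of the double-pledge check just described: a registry that rejects a pledge when the asset is already committed to another trade. The class and identifiers are hypothetical.

```typescript
// Sketch: reject a collateral pledge if the asset is already pledged
// elsewhere. Asset and trade identifiers are illustrative.

class CollateralRegistry {
  private pledges = new Map<string, string>(); // assetId -> tradeId

  pledge(assetId: string, tradeId: string): void {
    const existing = this.pledges.get(assetId);
    if (existing && existing !== tradeId) {
      throw new Error(`Asset ${assetId} already pledged to trade ${existing}`);
    }
    this.pledges.set(assetId, tradeId);
  }

  release(assetId: string): void {
    this.pledges.delete(assetId);
  }
}
```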

By making these assessments part of an integrated system, rather than parceling them out to another team with different timeframes, priorities and frameworks, they can be completed at transaction time, ensuring that deals are secure at their inception. Further, continuous monitoring – and reporting – of details including collateralization, financial position and stability allows for real-time decisions about trading profitability and risk optimization, should conditions surrounding the original trade change.

Technology As Equalizer

The days of managing financial transactions using Excel spreadsheets, email and shared drives are over. The increasing speed of business, smaller margins for error and globalization require securities management to look to financial technology to control risks to the greatest extent possible.

By merging multiple disparate, but overlapping functions in one system, deal makers on both sides of a transaction have increased transparency, risk mitigation and business opportunity. A strong securities management system helps control both operational and market factors, promoting business security, reliability and profitability.

Today, financial technology for securities asset trade management is a must. Is your organization ready?

Contact SRI Infotech to learn more about how we can help you improve your financial trade operations today.

WebRTC: Efficiency, Loyalty & Flexibility

Over the past few years, WebRTC – web real-time communications – has started to gain traction in both business and technology environments. For the first time, companies can enable in-browser or in-application communications – from chat to voice to video to document exchange – without the need for additional downloads, plugins or other barriers to fast and seamless interactions.
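
For the technically curious, a minimal sketch of the browser-side setup WebRTC enables is shown below, using the standard RTCPeerConnection and getUserMedia APIs. The signaling transport (how offers and candidates reach the other peer) is deliberately left abstract, since WebRTC does not prescribe one; the STUN server shown is a commonly used public example.

```typescript
// Sketch: starting an in-browser audio/video call with plain WebRTC APIs.
// No plugins or downloads are involved; everything runs in the browser.

async function startCall(
  sendToPeer: (msg: string) => void, // abstract signaling channel
): Promise<RTCPeerConnection> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
  });

  // Capture microphone and camera directly in the browser.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Forward ICE candidates through whatever signaling channel is in use.
  pc.onicecandidate = e => {
    if (e.candidate) sendToPeer(JSON.stringify({ candidate: e.candidate }));
  };

  // Create and send the session offer; the peer answers the same way.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer(JSON.stringify({ offer }));
  return pc;
}
```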

WebRTC is a game-changer when it comes to customer service, providing a seamless experience for the customer at the very time and in the very manner in which they dictate. Because it enables contextual continuity through various escalation levels, WebRTC focuses on customer satisfaction by valuing their time as well as their preferred method for engagement.

Additionally, WebRTC merges sometimes disparate back-end communications systems, allowing for greater enterprise efficiency and cost savings, as well as better customer engagement. By enabling single authentication and communications continuity between service levels, customers often feel better heard, which fosters brand loyalty.

SRI Infotech has worked on several projects related to WebRTC, and has firsthand experience of how this relatively new technology can assist your business in its daily operations.

Case Study 1: Automobile Insurance Provider

Recently, SRI Infotech was asked to help a large automotive insurance provider build a mobile application for the submission of automobile accident claims. The application not only allows the customer to open and submit a claim from a mobile device, but also to collect photographic evidence related to the claim.

When the driver is ready, he or she can then connect via mobile with an insurance agent in real time to review claims details.

After the connection is initiated by the customer, the insurance agent is able to interact with the customer in real time via chat, voice or video. Additionally, the agent has the ability to take control of the camera to capture photos that better document the evidence necessary for evaluating the claim.
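
The case study does not document how the agent-side camera control was implemented; one plausible mechanism, sketched below, is an RTCDataChannel carrying control commands alongside the media streams, with the customer’s device capturing the current video frame on request. The message format and function names are hypothetical.

```typescript
// Hypothetical sketch: remote photo capture over a WebRTC data channel.

// Agent side: request a photo over an already-open data channel.
function requestPhoto(channel: RTCDataChannel): void {
  channel.send(JSON.stringify({ type: 'capture-photo' }));
}

// Customer side: capture the current video frame when asked.
function handleCommands(channel: RTCDataChannel, video: HTMLVideoElement): void {
  channel.onmessage = e => {
    const cmd = JSON.parse(e.data);
    if (cmd.type === 'capture-photo') {
      const canvas = document.createElement('canvas');
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext('2d')?.drawImage(video, 0, 0);
      canvas.toBlob(blob => {
        if (!blob) return;
        // Upload the frame to the cloud repository, keyed by claim number.
      });
    }
  };
}
```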

Evidence collected through the app is stored to the cloud and affiliated with the claim number, providing a consolidated repository of all information necessary for making informed coverage decisions. The collection of all information allows for the processing of claims in hours rather than days.

Case Study 2: Technology Manufacturer & Support

Additionally, SRI Infotech worked with a large, international technology manufacturer and servicer to assist in the development of a WebRTC-based support infrastructure. The project – still in the proof of concept stage – aims to better facilitate onsite support by connecting the “person on the ground” with specialists in the corporate support center.

The system provides both offline ticket creation and documentation, as well as streamlined, real-time communications between support staff in different locations. As a result, issues can be tracked, escalated and managed in a more timely manner. Photos of error codes and videos of faulty functionality provide out-of-office support staff with a greater level of detail with which to troubleshoot the problem. Moreover, by using photographs of equipment serial numbers and SKU barcodes, problems can be accurately attributed to particular pieces of equipment and locations.

Overall, this system enables faster resolution rates for tickets that can be resolved same day, on site, as well as the opportunity to order replacement parts or equipment in real time if fixes require more complicated methods.

Conclusion

By providing cost-effective flexibility through the browser or an application, WebRTC has the potential to streamline customer support and service operations. Moreover, by easing interactions with clients on their time and through their preferred communication method, WebRTC strengthens customer loyalty. Businesses in all industries need to embrace WebRTC as part of their initiative to remain effective and competitive in today’s high-stakes environment.

Is your company ready for a WebRTC initiative? Contact us today to learn more.

Confronting Challenges in Identity and Access Management

In order to protect consumer and institutional assets, the financial services industry is heavily regulated across state, federal and international agencies. Significant requirements exist when it comes to protection of financial data.

Therefore, while identity & access management is of critical importance across all industries, this is particularly true in the financial services arena. Identity & Access Management (IAM) involves ensuring that the right individuals have access to the right data.

A robust identity and access management system controls individual access to specific data according to role and conflict avoidance requirements. Moreover, the best systems ensure that information access can be audited, monitored, logged, and reported as necessary. Just in time provisioning also plays a vital role, to ensure people gain – or lose – access as quickly as possible. The former ensures better corporate efficiency, while the latter minimizes security risks.

Yet, even the best IAM systems fail to remedy all challenges faced by BFSI entities in today’s rapidly evolving technology environment. The following are some of the trickiest challenges pertaining to identity management, and how the most forward-thinking entities are tackling them.

Work from Anywhere

In the past, employees who wanted to access corporate data worked from one location: on premises. As a result, security systems followed a so-called perimeter strategy, which focused exclusively on testing and perfecting security systems inside the physical plant. Employees typically did their work from an assigned space within a specific location.

Today, users work from various locations: home, a board room, an airport, a hotel. And regardless of their locations, employees need and expect access to protected data to do their job. While the specifics of IAM might not change that much by location independence, greater risk of corporate identity theft exists outside the traditional work environment. The likelihood of unauthorized access increases as people log in from across the street or around the globe.

In order to mitigate this threat, IAM now requires usage analytics and review to identify and prevent unusual access patterns. Repeated failed logins, random device access and repeated attempts to reach non-provisioned information after login all suggest that an access identity may be compromised. By using strong analytics and detection systems, atypical logins are identified sooner, allowing for a speedier resolution.
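
A toy sketch of the kind of rule-based scoring such analytics might apply to recent login activity. The signals mirror those named above; the event shape and thresholds are illustrative assumptions.

```typescript
// Sketch: score recent login activity against the signals named above.

interface LoginEvent {
  userId: string;
  success: boolean;
  deviceId: string;
  deniedResourceRequests: number; // attempts at non-provisioned data after login
}

function isSuspicious(recent: LoginEvent[]): boolean {
  const failures = recent.filter(e => !e.success).length;
  const devices = new Set(recent.map(e => e.deviceId)).size;
  const denied = recent.reduce((n, e) => n + e.deniedResourceRequests, 0);

  // Any one signal alone may be benign; combinations raise the score.
  const score =
    (failures >= 3 ? 1 : 0) + // repeated failed logins
    (devices >= 4 ? 1 : 0) +  // random device access
    (denied >= 5 ? 1 : 0);    // probing non-provisioned information

  return score >= 2;
}
```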

Bring Your Own Device

Related, but not identical, to the work from anywhere trend is the bring your own device (BYOD) initiative. The greatest challenges, often associated with mobile devices (cell phones and tablets in particular), are the different security protocols created by different manufacturers. Ensuring a uniform security protocol across all devices takes time, planning and testing, which can stress an already busy IT department.

However, BYOD is here to stay. Employees want 24x7, anywhere access to enterprise applications, such as email, which may contain protected information. As a result, even the inadvertent loss of a mobile device presents a great risk of confidential information falling into the wrong hands. The notification delays that frequently accompany lost devices further exacerbate that risk.

Strict IAM doesn’t adequately protect against mobile threats. Another layer of access control needs to accompany traditional sign on methods. Yet, the more arduous the protocol, the less satisfied the user. Dissatisfaction often results in lack of compliance, including the saving or forwarding of passwords in unsecured applications, such as notepad or personal email.

A common solution to this issue is so-called “smart authentication,” which recognizes the login location through network identification. A rules- and exception-based protocol allows login requirements to vary with the location of the access request. When a user is in an approved location, login protocols can be simplified. When users are in unlisted locations, additional security measures are applied.
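
In code, the rule could be as simple as the sketch below: a lookup against an approved-network list decides whether to require step-up authentication. The network identifiers and requirement labels are hypothetical.

```typescript
// Sketch: location-aware step-up authentication, assuming an
// approved-network list maintained by the security team.

const APPROVED_NETWORKS = new Set(['corp-office-lan', 'corp-vpn-pool']);

type AuthRequirement = 'password' | 'password+mfa';

function requiredAuth(networkId: string): AuthRequirement {
  // Simplified login from approved locations; step-up everywhere else.
  return APPROVED_NETWORKS.has(networkId) ? 'password' : 'password+mfa';
}
```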

Conclusion

The best identity and access management systems stay ahead of challenges presented by our rapidly evolving economy. They enable not only rapid provisioning and decommissioning, but also provide a layer of defense against common risks in the banking, financial services and insurance industries. By investing in robust IAM, BFSI enterprises can ensure they remain in compliance with regulatory requirements, protect consumer information and thwart cyber-theft. It is those organizations that will gain or retain a competitive advantage in the coming years.

Want to learn more about developing and implementing robust IAM solutions? Contact us today.

Big Data and Monitoring the Cloud

The cloud is becoming an increasingly popular way for companies across all industries to reduce costs related to their IT infrastructure. By embracing the cloud, companies are able to cut infrastructure spending, innovate more rapidly and engage in more responsive client relationships.

However, operating in the cloud requires a different approach to monitoring and optimization.

Why Monitoring Must Change

In the past, applications often ran on specific hardware, physically separated from other, dissimilar IT resources. Different technologies, languages and platforms were common, but generally of little consequence. Silos were frequent, but didn’t impact monitoring, load or performance. Each team developed monitoring techniques and expertise uniquely suited to its specific needs.

In today’s cloud environment, different applications using different platforms and technologies often share the same stack. This heterogeneous environment requires coordination between team owners that was previously unnecessary.

In a traditional data center, it was common and acceptable for development teams to generally ignore infrastructure. Dedicated server teams took responsibility for the physical environment, while application teams monitored app and end user performance. However, in a cloud scenario, development teams have end-to-end responsibility, which includes the backend infrastructure.

Why? Application changes – which happen more quickly in the cloud – may have unanticipated repercussions on application, environment or end user performance across the cloud. Degraded performance is no longer limited to one team. It can affect the entire organization.

As a result, there is now a need to facilitate communications between the employees responsible for host, network, application and end-user monitoring. Everything from service-level data, such as errors and response times, to real-user monitoring, behavioral data and log monitoring needs to be freely available among previously unrelated teams. Without improved communications strategies, transparency and visibility will be lacking, and problems increasingly likely.

Potential solution – Big Data

In order to facilitate the collection and sharing of performance metrics at each particular layer (infrastructure, application, user), organizations must invest in both automation and technical expertise. It is unlikely that a single tool will suffice to bring necessary data points together.

However, by feeding data streams from different solutions into a common big data back end, greater transparency is available for processing and analysis.
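
One minimal way to realize this is to normalize each tool’s output into a common event schema before it lands in the shared back end, as sketched below. The schema and the adapter are assumptions for illustration, not a reference to any particular monitoring product.

```typescript
// Sketch: a common schema for monitoring events from different layers.

type Layer = 'infrastructure' | 'application' | 'user';

interface MetricEvent {
  layer: Layer;
  source: string;    // originating tool or agent
  metric: string;    // e.g. 'cpu.load', 'http.error_rate', 'page.load_ms'
  value: number;
  timestamp: string; // ISO 8601
  tags: Record<string, string>; // host, service, region, ...
}

// Each tool-specific adapter maps raw payloads into MetricEvent so
// downstream analysis can correlate across layers.
function normalizeHostSample(raw: {
  host: string;
  load: number;
  at: number;
}): MetricEvent {
  return {
    layer: 'infrastructure',
    source: 'host-agent',
    metric: 'cpu.load',
    value: raw.load,
    timestamp: new Date(raw.at).toISOString(),
    tags: { host: raw.host },
  };
}
```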

By implementing a big data monitoring protocol, insight into the health of the cloud environment becomes more accurate, more available through decreased collection cycles and more meaningful to a greater number of users in a shorter amount of time.

Moreover, because big data is uniquely qualified to consolidate and make sense of data points from numerous sources in various formats, it enables the detection, and even the prediction of performance issues.

Combining big data with cloud monitoring mitigates operational risk, reduces costs and helps maximize the potential of the cloud, ensuring return on investment and increased operational efficiency.

Want to learn more about cloud migrations, including big data projects for monitoring? Contact us today.

Maximizing Cost Savings In The Cloud

One of the primary business drivers for enterprises migrating to the cloud is cost savings. Generally speaking, cloud migrations have the potential to reduce spending on staffing, security and infrastructure. However, that doesn’t mean that migrations are cost free. There are real costs that need to be included in migration budgets.

According to Forrester, customer-facing applications – so-called systems of engagement – are the easiest to move to the cloud. These applications are generally developed with new code bases, and utilize more modern infrastructures, paradigms and languages. Because they are more modern, and are developed for the web and mobile, they are inherently easier to migrate to the cloud when necessary.

Moreover, because engagement applications are often tied to marketing and consumer retention strategy, they are increasingly developed in the cloud from the beginning. By developing in the cloud, companies are able to capitalize on reduced development time, iterative testing and upgrades, and faster rollouts. For these applications, the cloud works because speed is of the essence. With native cloud development, migration costs are negated, while potential revenue streams increase on release.

Migrating Enterprise Applications

The real migration challenge lies in legacy tools. These mission critical applications usually have to comply with strict security, administration, privacy, maintenance and uptime requirements. According to Forrester, they are frequently “old and either completely or substantially custom-built.” As a result, they often require “substantial revision to achieve cloud’s primary benefits—on-demand scaling, pay-for-what-you-use economics, global availability, and high security.”

It’s this redevelopment of legacy applications – by developers familiar with both the cloud and the application itself – that accounts for the greatest expense in cloud migrations. As detailed in the report, “[L]abor costs dwarf infrastructure and platform services costs in most of the migration projects we’ve reviewed.” Once all necessary team members are accounted for – the strategists, developers, project managers, and compliance and risk analysts – labor costs often exceed 50% of total migration costs.

Managing Migration Costs

Given the significant up front labor costs associated with migration of legacy applications, some question whether they should be attempted at all. Should cloud migrations only occur when existing infrastructure upgrades are imminent?

Not necessarily. Waiting for a perfectly opportune time to upgrade ignores the future cost savings (including labor) that cloud migrations enable.

Upon completion of cloud migrations, labor costs often diminish significantly. Among global enterprises, approximately 67% of the IT budget is spent on labor for application development and maintenance, hosting, security and end user support. These costs can be significantly reduced if not eliminated after migrating to the cloud. In a typical scenario, person to server ratios improve from 1:5 to 1:10 or even 1:15.
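
To make the ratio arithmetic concrete, here is a small worked example with illustrative figures (the 1:5 and 1:15 ratios come from the paragraph above; the server count and per-person cost are assumptions):

```typescript
// Worked example: labor savings from improving the person-to-server
// ratio from 1:5 to 1:15. All figures are illustrative.

const servers = 300;
const costPerAdmin = 120_000; // assumed fully loaded annual cost

const adminsBefore = servers / 5;  // 60 people at 1:5
const adminsAfter = servers / 15;  // 20 people at 1:15

const annualSavings = (adminsBefore - adminsAfter) * costPerAdmin;
console.log(`Annual labor savings: $${annualSavings.toLocaleString()}`); // $4,800,000
```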

When combined with subsequent cost reductions pertaining to infrastructure, physical plant, utilities, agility and capital expenditures, cloud migration projects are still likely to result in significant return on investment.

Go Global

That said, smart institutions want to reduce costs as much as possible from the get-go.

To further reduce migration costs, organizations should consider enterprise migration planning. By establishing an enterprise migration initiative and creating interdisciplinary teams that specialize in migration planning and implementation, organizations develop clear visions for migration goals, establish governance models that define roles and responsibilities within and across teams, and define performance benchmarks. Centralizing business, oversight and technical expertise in one team ultimately avoids duplicated work and yields savings through economies of scale.

With an enterprise strategy, migration planning and processing becomes more predictable, more time effective and less labor intensive with each subsequent project. By using knowledge gained in one project to inform another, organizations can develop migration plans that capitalize on existing experience and identify potential pitfalls more readily.

Conclusion

Labor costs are significant factors in determining return on investment for cloud migrations. While they do decline over subsequent projects as teams become more familiar with migration protocols, they need to be accurately accounted for during the planning phase.

Manufacturing Efficiency Through Technology

Companies in the highly competitive manufacturing sector need to find points of operational advantage in all business endeavors, including on the shop floor. Computer-aided Manufacturing (CAM) – the use of computerized systems to control machining operations – is a frequently implemented improvement.

By utilizing software applications that define batched, predefined manufacturing configurations, companies are able to improve reliability, reduce errors and execute projects more easily. Use of automated processes in manufacturing enables delivery of products to market within shorter time-frames, leading to greater business opportunities, reduced costs and heightened profitability.

Case Study Background

Our client is an industry-leading manufacturer of components sold to Original Equipment Manufacturers (OEMs) in the building products industry. In their daily operations, floor operators use tiger saws to make precise cuts for the creation of window and door frames.

Although they had been using a homegrown system to facilitate job management, it required significant manual configuration for each job, increasing time to completion and cost. They sought to improve cutting accuracy, reduce mistakes and product waste, and enhance overall productivity and efficiency with an updated infrastructure.

SRI Infotech worked with the client to develop a state-of-the-art computer-aided manufacturing system, built on ASP.Net. The user interfaces – from the shop floor to the management suite – run on Microsoft Windows workstations.

Details and operations

The developed system is divided into 4 hierarchical components, based on user access and functionality.

Floor operators utilize the software in the day to day course of operations. Managers & reviewers establish critical settings and define configurations. System administrators not only determine settings and configurations, but also have access to overall data, and the ability to set user permissions.

Each saw within the plant is connected to local web services. Particular attention was paid to developing an intuitive user interface, allowing floor operators to initiate the production sequence by selecting the job checkbox. Job details – previously entered into the system by the system administrator – including material length, cut angle and other technical specifications – are then sent to the particular saw through web services, initiating the production process. Operators additionally have the ability to cancel or pause individual jobs as necessary.
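
A rough sketch of what the dispatch step might look like over the local web services follows. The case study confirms that material length and cut angle are among the job details; the endpoint, payload shape and field names are assumptions.

```typescript
// Hypothetical sketch: sending a selected job's details to a saw
// through local web services.

interface SawJob {
  jobId: string;
  materialLengthMm: number;
  cutAngleDeg: number;
  quantity: number;
}

async function dispatchJob(sawEndpoint: string, job: SawJob): Promise<void> {
  const res = await fetch(`${sawEndpoint}/jobs`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(job),
  });
  if (!res.ok) throw new Error(`Saw rejected job ${job.jobId}: ${res.status}`);
}
```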

Big Data

In addition to sending instructions out, the bidirectional system also collects information about the order, including status, length of operation and completion.

Job data is subsequently collected in a central MS SQL database and used in various business processes, including quality control, operational efficiency calculations, alerts, business intelligence and accounting. Each job is also archived.

Access to aggregated project data enabled the client to gain insight into prevailing market trends, allowing quicker strategy changes for better profitability. Additionally, by conducting meta-analyses of previously siloed data, the client was able to recognize opportunities for improved efficiency across seemingly disparate operations, reducing costs.

Results

By investing in computer aided manufacturing, SRI Infotech’s client was able to improve reliability, reduce errors and execute projects more easily. Moreover, the client was able to collect data pertaining to market demand, allowing them to better capitalize on business opportunities in shorter timeframes.

As a result of embracing new technologically aided manufacturing processes, the client saw improved production, profitability and future potential.

If your company is interested in learning how it can use technology to maximize its earning potential, please contact SRI Infotech today.

OTC Derivatives Self-Service

Most OTC derivatives applications used by financial institutions around the world share the same characteristics. Third party data collection, trade position calculation and audit trails are common features.

Yet they lack transparency and real-time information. Obtaining reports and data frequently requires the participation of technologists, reducing efficiency and transparency, and increasing costs. While these systems are significant improvements over the old Excel spreadsheet days, they are insufficient for today’s marketplace.

Our client, a global custodial bank and financial institution, sought to upgrade their back office operations with software that met their business needs in an increasingly competitive business landscape.

Project Goals

The primary goal of this project was to develop self-service modules for employees in the following departments related to OTC derivatives trade: pricing, accounting, corporate action, trade directives and information delivery. Prior to the development of this self-service system, and as recently as 2 or 3 years ago, traders relied on developers to run reports related to trade position, as well as to fix trades that were either erroneously entered or that lacked requisite information due to lags in price feed transmission.

About The Technology – Pricing & Accounting

The OTC self-service system allows business end users to log into the accounting center and upload not only trade details, but their own prices for trades when 3rd party vendor valuations are either incorrect or delayed. Subsequent to trade entry, the pricing system automatically coordinates with the accounting function to ensure proper payment and collection data.

Moreover, authorized traders are also able to run position reports, not only at the trade level, but also at the lot and leg level. This capability provides granular data about who is trading specific securities, aggregated payor and payee data, and fixed- and floating-rate information across the enterprise. The availability of enterprise-wide data protects the client against over-leverage scenarios.
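
The leg/lot/trade hierarchy described above lends itself to simple upward aggregation, as in the sketch below; the type shapes are assumptions for illustration.

```typescript
// Sketch: aggregating positions upward from leg to lot to trade level.

interface Leg {
  notional: number;
  payFixed: boolean; // fixed vs. floating side of the leg
}
interface Lot {
  legs: Leg[];
}
interface Trade {
  tradeId: string;
  lots: Lot[];
}

function tradeNotional(trade: Trade): number {
  return trade.lots.reduce(
    (lotSum, lot) =>
      lotSum + lot.legs.reduce((legSum, leg) => legSum + leg.notional, 0),
    0,
  );
}
```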

By creating these trade related self service capabilities, end users gained increased independence to obtain the information when and how they needed it, while the client avoided repetitive development costs.

About The Technology – Information Delivery

After trade completion, the priority switched to trade directives, trade position, and risk analysis and management. To help facilitate improved transparency, SRI Infotech developed an all in one data sheet pertaining to trades, trade parties and ownership position.

Additionally, to improve information delivery to both internal and external stakeholders, SRI Infotech developed a self-service automated report delivery module. After a one-time setup in which business end users define the client, the funds and other parameters, customized data is automatically delivered to internal and external stakeholders through a secure FTP protocol. As a result, authorized business users can customize data obtained from over 6,000 extracts to ensure that they receive only the information they need at the exact time they need it.
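
The one-time setup might be captured in a record along the lines of the sketch below; every field name here is an assumption rather than a description of the actual module.

```typescript
// Hypothetical sketch of a report delivery subscription record.

interface ReportSubscription {
  clientId: string;
  funds: string[];
  extracts: string[]; // selected from the available extract catalog
  schedule: 'daily' | 'weekly' | 'monthly';
  sftpDestination: {  // delivery over a secure FTP protocol
    host: string;
    path: string;
    username: string; // credentials held in a secrets store, not here
  };
}
```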

Permissions for different data streams are defined by senior management on a client by client basis to ensure adherence to conflict avoidance rules and regulations.

Conclusion

Through the use of automation and self-service applications, SRI Infotech assisted the client in empowering its traders to work more independently and efficiently, while simultaneously improving transparency and cutting development costs and institutional risk. In the highly competitive OTC derivatives market, this provided the client with a significant competitive advantage.

Let SRI Infotech help you maximize productivity on your trading floor. Contact us today and learn more.

Big Data’s Increasing Role in Financial Services

Big data is a pretty popular buzzword in the tech space these days. And while it has well established implications for marketing, product development and client retention, financial services organizations are also looking at big data to support core business functions related to asset and trade management, risk management, regulatory compliance and information security.

In a world where data is growing by exabytes per year, financial services organizations that leverage big data find themselves at a significant competitive advantage. By aggregating and analyzing near real-time data from across departments, geographic locations and third party vendors, financial services entities are able to reduce operational risk, maximize profitability, ensure regulatory compliance and fight fraud and other information security risks.

1. Trading and operations

Big Data enables the aggregation and analysis of real-time data from multiple internal and third party sources prior to trade execution. While trade analysis historically involved some level of “gut”, today more and more algorithms are being written to capitalize on data processing, often within microseconds.

Real-time enterprise modeling and analytics platforms frequently aggregate data extracts from all operational, third party and non-traditional data sources, including news and social media. Business users and analysts are then able to explore the data and develop analytic business models. By building an enterprise-wide self-service analytics platform, users are given controlled access to explore data when and how they need it. As a result, users obtain readily available, actionable insight, leading to higher margins and greater profits.

While real time trade data is certainly of critical importance in this process, historical transactional data also plays a big role in trade operations. Institutions can leverage historical data and market movement information to feed trading and predictive models and forecasts. By analyzing past performance, better predictive analyses can be established for future business modeling.

Through big data optimization, the front office gains greater insight into trade risk and exposure, counterparty reliability and cash management. As a result, they have the potential to improve profitability, efficiency and cost-containment with greater ability than ever before.

2. Regulatory Compliance

But big data isn’t only a front office initiative. Regulatory compliance is a critical issue in the financial services world, and is changing rapidly. In an environment where 1) regulations change frequently, 2) adherence is complicated by detailed requirements and 3) the penalties for violating regulations are increasingly costly, this area of risk management is receiving greater and greater attention.

In the regulatory arena, fines for non-compliance can add up to millions of dollars, increasing the desire for tools which protect against violations. As a result, investment in big data solutions which proactively defend the institution are seen as important strategic decisions.

Institutions can use big data to measure regulatory compliance by combining regulatory data with supporting documents, contracts, attestation and transactional information. By monitoring unstructured and unrelated content, including IM chats, emails, and telephone calls, and combining them algorithmically with trading activity data and documentation, compliance teams are forewarned of potential conflicts putting the trader and the organization at risk.
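
As a toy illustration of the correlation step described above, the sketch below joins surveillance records with trades by trader, time window and instrument mention. Real systems use far richer matching (entity resolution, natural language processing); every type and threshold here is an assumption.

```typescript
// Sketch: flag communications that mention an instrument the same
// trader dealt in within a time window.

interface CommRecord {
  traderId: string;
  sentAt: number; // epoch ms
  text: string;
}
interface TradeRecord {
  traderId: string;
  executedAt: number; // epoch ms
  instrument: string;
}

const WINDOW_MS = 24 * 60 * 60 * 1000; // one day, illustrative

function flagPotentialConflicts(
  comms: CommRecord[],
  trades: TradeRecord[],
): { comm: CommRecord; trade: TradeRecord }[] {
  const flags: { comm: CommRecord; trade: TradeRecord }[] = [];
  for (const trade of trades) {
    for (const comm of comms) {
      const sameTrader = comm.traderId === trade.traderId;
      const inWindow = Math.abs(comm.sentAt - trade.executedAt) <= WINDOW_MS;
      const mentions = comm.text
        .toLowerCase()
        .includes(trade.instrument.toLowerCase());
      if (sameTrader && inWindow && mentions) flags.push({ comm, trade });
    }
  }
  return flags;
}
```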

3. Fraud Management

Financial services organizations are exposed to many kinds of fraud, which can result in millions of dollars in losses. Predictive analytics tools are used to build models that detect and prevent fraud, correlating data across different sources to identify fraudulent behavior across unrelated data points. While credit card fraud detection based on location, prior shopping history and spending patterns is familiar to many, the same capabilities can also be used in asset management scenarios.

These solutions operate in real time, using in-memory technologies to analyze hundreds of terabytes of data against each transaction and detect fraud as it happens.
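
A deliberately simplified sketch of hot-path scoring in that spirit: a transaction is scored against a customer profile, and high scores are held for review. Features, weights and the threshold are illustrative assumptions; production systems learn these from data.

```typescript
// Sketch: rule-based fraud scoring of a transaction against a profile.

interface Profile {
  homeRegion: string;
  avgAmount: number;
}
interface Txn {
  region: string;
  amount: number;
  minutesSinceLastTxn: number;
}

function fraudScore(profile: Profile, txn: Txn): number {
  let score = 0;
  if (txn.region !== profile.homeRegion) score += 0.4;  // unusual location
  if (txn.amount > profile.avgAmount * 5) score += 0.4; // unusual size
  if (txn.minutesSinceLastTxn < 2) score += 0.2;        // rapid-fire activity
  return score; // e.g. hold for review above 0.6
}
```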

By correlating seemingly unrelated incidents to identify fraud with greater speed and accuracy, financial institutions are able to reduce exposure and reduce liability for fraudulent losses.

Conclusion

Big Data plays an increasingly important role across the front, middle and back office of financial institutions. By taking advantage of previously unavailable insights, custodians are better able to serve their customers while capitalizing on business operations and profitability.

To learn how big data can help your BFSI organization, contact SRI Infotech today.

Picking a pilot process for RPA

When an organization is considering the implementation of a new technology, it wants to identify the benefits before making a large investment. Running a pilot program is therefore extremely important, as it allows an organization to test the solution on a small scale and decide if the initiative suits its needs. Keep in mind, a pilot is different from a proof of concept (POC), but more on that another day.

Choosing the best pilot process is the most important part of a successful robotics operation. Many companies want to begin by automating their pain points; however, in many cases this is not the best place to start. When picking a pilot, you want to ensure the process you choose will demonstrate the value of RPA, not bury you in development or process re-engineering. In this article we outline seven key features of a good pilot candidate.

Start small

A good way to start an RPA project is to choose a smaller process to build up knowledge and required capabilities before scaling up. However, the pilot process cannot be too simple; remember that it should be relevant and bring benefits to your organization.

Keep it simple

Avoid selecting processes that are too complex or that have multiple paths; look for something simple, where the happy path covers at least 80% of cases. Complicated processes increase the risk of delays due to the different types of exceptions. Such processes usually involve several people, which may make it too difficult to map each step properly. A better choice is a process that a single subject matter expert can describe in detail.

Minimize the level of effort

Another important factor is ease of automation, although this may be difficult to determine, especially at the beginning of an RPA journey. An experienced RPA lead developer can perform an application assessment to estimate the level of effort required; they are familiar with the capabilities of the tool and can identify potential issues and present possible solutions. Typically, processes that require interacting with virtual environments, such as Citrix, are not suitable for pilot projects, as they rely on surface automation techniques. Instead, choose applications that are easier to automate, such as browser- or Windows-based applications.

Use digital and structured data

Processes chosen for the automation project should have input data supplied in digital, structured form. If the data does not come directly from the target application or system, consider creating an example Excel template to simplify ingestion of the data by the digital worker. Look for processes with structured data inputs, for example XLS, XML, CSV or JSON, and avoid processes that rely on data from scanned documents!
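
As a small illustration of structured input, the sketch below parses a CSV extract into typed work items for a digital worker. The column names and item shape are assumptions.

```typescript
// Sketch: parse a structured CSV input into typed work items.

interface WorkItem {
  caseId: string;
  amount: number;
  dueDate: string;
}

function parseCsv(csv: string): WorkItem[] {
  const [header, ...rows] = csv.trim().split('\n');
  const cols = header.split(',').map(c => c.trim());
  return rows.map(row => {
    const cells = row.split(',').map(c => c.trim());
    const rec: Record<string, string> = {};
    cols.forEach((c, i) => { rec[c] = cells[i]; });
    return {
      caseId: rec['case_id'],
      amount: Number(rec['amount']),
      dueDate: rec['due_date'],
    };
  });
}

// Example: parseCsv('case_id,amount,due_date\nA1,250,2024-01-31')
```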

Ensure processes follow strict rules

Avoid picking processes with poorly defined or documented rules and procedures, or ones that require intuition. The RPA developer requires precisely described procedures, down to the click and type level, to train the digital worker. Finally, remember to document each step of the process accurately, provide the input data format and list all types of business exceptions.

Maximize the number of available test cases

To maximize the chances of a successful RPA solution, test the process with enough test cases to cover as many production scenarios as possible. In the User Acceptance Testing (UAT) phase, ensure that test plans account for at least one full day’s worth of data. Running these tests properly will drastically limit any surprises in production.

Identify the business value

As mentioned before, a pilot process should be rather small and simple; however, keep in mind its relevance. Look for direct benefits to the organization, such as employee optimization, decreased throughput time or a reduction in errors. Avoid selecting any critical process, as the risk of mistakes could be unacceptable. Target no-risk or low-risk processes, keep a human in the loop to make final decisions, and ensure the benefits are quantifiable with minimal effort.

Summary

There are no hard rules on how to choose a pilot project; however, following these tips should make life a bit easier and drastically increase the chances of success for an RPA initiative. A digital worker is similar to a new employee: just as one would train a new employee to perform a task, a digital worker requires a similar approach. Not to oversimplify things, but when looking at potential pilot projects ask, “Is this task something we would allow a new employee to do in their first couple of weeks?” If not, chances are there is a reason why that process might not be the best choice to kick off an RPA initiative! Additionally, consider working with a partner like us! We have experience evaluating potential processes, identifying their automation complexity and estimating the level of effort required to implement them. Plus, after the assessment, we have resources available to assist you further, or even complete the project from start to finish 😊.
