Asset Manager PEAK6 Selects SunGard for a comprehensive Portfolio Management and risk system



http://www.derivsource.com/articles/asset-manager-peak6-selects-sungard-comprehensive-portfolio-management-and-risk-system

PEAK6 Advisors LLC (“PEAK6”), a Chicago-based asset manager specializing in alternative investments, has gone live with SunGard’s Front Arena and Monis solutions. The solutions help PEAK6 better service its clients by providing customized portfolio and risk management, valuations and trade processing.

SunGard’s Front Arena and Monis help increase operational efficiency by allowing PEAK6 to concentrate on what it does best – delivering returns and achieving agile growth.

“We wanted to ensure we are conducting ongoing due diligence by offering our investors a well-defined investment management system that can handle our assets under management in an efficient way. SunGard’s Front Arena and Monis help mitigate our internal risk and automate our validation processes – giving our investors peace of mind.” – Scott Kramer, chief technology officer, PEAK6

The implementation was completed in less than four months and met PEAK6’s requirements for an open architecture and highly configurable functionality.


Standardising TCA – article from FIXGlobal Magazine


http://fixglobal.com/content/standardising-tca

Mike Caffi, VP and Manager of Global TCA Services, State Street Global Advisors, and Mike Napper, Director and Head of Global Client Analytics Technology, Credit Suisse and EMEA Client Connectivity Technology, Co-Chairs of the FPL TCA Working Group, examine the motivations for, and the progress of the TCA reference guide.

What is the history of the FPL TCA project?
Mike Caffi: The industry has been lacking in any kind of standards for TCA, and that’s not a new problem. I have seen some really good TCA white papers over the last five to 10 years that have tried to address the subject, but eventually these fade on the shelf because there is no follow-up support or interest. What we’ve always needed was some independent group to be able to take ownership of this, but how does one get that started? It never really came about until about a year and a half ago, in September of 2011, with the formation of the OpenTCA group. The group was a collaboration of four sell-side firms in London and EMS and TCA vendor TradingScreen, which was the glue that put them all together.

When I read their white paper I got really excited because I saw a group of individuals who were trying to promote at least what appeared to be an essence of a global standard. They held a conference in London, and then they came to Boston and I contacted the person who was heading the public relations at TradingScreen. They invited me to be on a panel to talk about the benefits of standards, and that was in November of 2011. At this meeting TradingScreen had really tried to move this along, but I saw the need for a larger group to really take on this challenge as well. So that’s where I felt a group like FPL would be perfect.

As a matter of fact, when I was at the conference, I related this back to the early beginnings of the FIX Protocol when I was involved 15 years ago. We needed the collaboration of industry participants within a neutral body such as FPL, to take ownership of TCA standards. I even posed that question to the audience: “who would be willing to form a group, putting up a small amount of money just to get the essence of a working group together?”, but there really wasn’t much reaction at that particular time. Given that the holidays were approaching I decided to let it simmer down until after the New Year.

By early March I contacted John Goeller, FPL Americas Regional Co-Chair, and ran the idea by him to see if FPL would be interested in hosting a TCA Working Group. John liked the idea and soon after ran it by the organisation’s Global Steering Committee, which agreed it would deliver strong industry benefit. From there, FPL leaders opened discussions with TradingScreen and, given strong FPL member firm interest in addressing some of the key business issues impacting the TCA environment, it was agreed that FPL would create a TCA working group. Representatives from TradingScreen joined this parallel activity.

So it was agreed to put out a call for participation, and on the first pass we had about 70 people sign up. We had our first meeting in June of last year, and that is when we got traction; at that point it was really pretty much driven by consensus, which evolved into a survey that allowed us to prioritise our objectives. That gave us greater focus and direction on what to do and, from that survey, we could see that with 70 or 80 people, we needed to break out into smaller groups.

The number one issue highlighted in the survey was terminology and methodology, and as such it was decided that our first working group should focus on coming up with standardised definitions for TCA in the equity space. This has taken some time as we wanted to take a slightly different approach, not writing a white paper, but more of a working reference guide. That project has been our focus for the last four or five months and right now we’re at a point where individuals are actually finalising the more difficult aspects of that document.

Last September, I said it’d be great if we can have this all done by the end of the year, and that was just a bit presumptuous on my part. Now I realise this is going to take quite a while to produce as there is a lot of work involved, and this is just the equity space. We’re going to look at multi-asset class perspectives of TCA after that.

Mike Napper, would you like to give a brief overview of the reasons behind your involvement?
Mike Napper: I’m interested and involved in this initiative from two perspectives. Firstly, I head Credit Suisse’s Global Transaction Cost Analysis Technology, both pre-trade and post-trade. Secondly, I also head FIX Client Connectivity for Credit Suisse in EMEA, for Equities, FX and Listed Derivatives, and thus have exposure to the FIX Protocol and FPL.

In 2Q2012, I was invited to help lead this initiative as the sell-side co-chair and I was very happy to contribute. The standardisation will help everyone in the market. It will help clients by providing more clarity on the reports they’re reading. It will help brokers and third party vendors by providing a consolidated reference guide explaining the principles and methodologies to all stakeholders. Firms are doing a lot of creative and original technical analysis, but there isn’t consensus in all cases on some of the basic stuff, and that’s an opportunity.

An area of particular interest, with both my TCA and FIX hats, is the convergence of asset classes onto electronic trading over time, providing greater automation and transparency. We can agree some foundational definitions for what TCA means in a multi-asset-class sense. We have started with a set of Equities definitions, to clarify and standardise what’s already out there and in most cases mature. Then, we’ll expand and mature the definitions across asset classes, where in some cases there is less existing consensus.

How are you unifying the different definitions that each market and regulator has, on areas such as the consolidated tape?
MN: The consolidated tape is an area where Europe has lagged. MiFID created the market fragmentation: this has been good for the industry’s efficiency and has promoted competition and innovation. Yet it left us with challenges on the market data side, notably with respect to TCA. We all spend time and effort re-assembling what we think is the best representation of a consolidated tape, from fragmented data, and using our own views of which venues and liquidity types to include. These varied approaches can cause confusion. We are not in this TCA process to solve the lack of a consolidated EMEA tape, other groups are working on that, but we can provide some guidance and standardisation on how best to handle the fragmentation.

MC: From the overall results of the survey, the consolidated tape was at the very bottom of the list. I think it’s not that it was the least concern on the survey, but rather the one over which we’d have the least control. From those discussions, we have found that being within the FPL group has had its benefits, because we’ve been able to liaise with other teams that are working more closely on those types of projects, such as the EMEA FPL Trade Data Standardisation Working Group.

Computing VWAP is one area within the European consolidated tape where we can provide some guidance; that’s where standardisation of TCA can help. And, as Mike said, guidance on which liquidity sources should be included or excluded. That’s where standards get tricky, as you may need to structure VWAP in a certain way for your own internal reporting, so this becomes more of a guideline than an absolute. I also see the reference guide providing guidance on when it is best to use VWAP, when it is best to use a point-in-time reference such as an arrival price or any other type of benchmark, and what the results you get actually mean.
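As a rough illustration of why this needs standardising: a VWAP benchmark is just a volume-weighted average over the prints one chooses to include, and everything contentious lives in that choice. A minimal sketch (the prints and venues are entirely hypothetical):

```python
# Minimal VWAP sketch: the benchmark is only as meaningful as the set of
# prints fed into it, which is the standardisation question at issue.
def vwap(prints):
    """prints: iterable of (price, volume) pairs from the venues you include."""
    total_value = sum(price * volume for price, volume in prints)
    total_volume = sum(volume for _, volume in prints)
    if total_volume == 0:
        raise ValueError("no eligible volume")
    return total_value / total_volume

# Example: the same stock and day give a different benchmark depending on
# whether a hypothetical dark-pool print is counted as eligible liquidity.
lit_prints = [(10.00, 100), (10.05, 200), (10.10, 100)]
dark_print = (9.90, 400)
print(vwap(lit_prints))                 # 10.05 (approximately)
print(vwap(lit_prints + [dark_print]))  # 9.975 (approximately)
```

Two providers applying different inclusion rules to identical market data will report different VWAP slippage for the same order, which is exactly the confusion the reference guide aims to reduce.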

I think we can help educate the recipients of TCA to better understand what it is they are receiving, or to know when to ask questions. The OpenTCA paper provided a number of excellent exhibits to help guide the selection of a provider and/or the interpretation of third-party TCA. There are many approaches to TCA, and the results can be misleading if you are not aware of the methodology being used.

What comes next, once it is finalised?
MC: My intent is to make this a living document that will adapt as the markets evolve over time; I don’t want it to be one of those things that gets put on the shelf and forgotten.

MN: We think of it as building something like the FIX Protocol: it will continue to be refined by its participants and users through consensus and the governance provided by the FPL process.

MC: There’s another aspect that came out of the survey results, and this is where my co-chair has a lot of strength: there is the TCA perspective, but there will also be some overflow into the FIX Protocol space. That’s where Mike Napper has much more expertise than I do, because it’s part of his day-to-day work and he understands the Protocol in terms of what we need to adjust. There have been ideas for potential enhancements, introducing further standardised tags to the FIX Protocol to make TCA more robust within the FIX realm. Mike Napper will lead this work stream and look into the FIX-related aspects of TCA once we get beyond the basics.

MN: What tends to happen is that many instructional fields from clients to brokers may get defined on a custom per-broker/per-vendor basis. Thus we have a wide range of custom tags in operation. If we can get some consensus around, say, the 10 key pieces of information that you need to capture around each order, to provide a standard and robust TCA computation, then we can look to embed those as standard tags in the FIX Protocol. We’re still working on finalising the consensus around core TCA methodology and definitions, so we’re not yet looking at the implementation of FIX tags… but that will come.
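By way of illustration only – the actual field list is precisely what the group is still finalising – a standardised per-order record for TCA might capture something like the following. All field names here are hypothetical, not the working group’s output:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TcaOrderRecord:
    """Hypothetical per-order fields a robust TCA computation typically needs."""
    order_id: str
    symbol: str
    side: str                 # "BUY" or "SELL"
    order_qty: int
    exec_qty: int
    avg_exec_price: float
    arrival_time: datetime    # decision/arrival timestamp for arrival-price benchmarks
    arrival_price: float
    strategy: str             # e.g. "VWAP", "IS"; today often a custom per-broker tag
    last_market: str          # FIX Tag 30, ideally a MIC code

def shortfall_bps(rec):
    """Implementation shortfall vs. arrival price, in basis points."""
    sign = 1 if rec.side == "BUY" else -1
    return sign * (rec.avg_exec_price - rec.arrival_price) / rec.arrival_price * 1e4
```

With a core set of fields like these carried on standard FIX tags rather than per-broker custom tags, every provider’s shortfall number would at least start from the same inputs.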

Is that where you’d say your biggest challenge to date has been, with the definitions, the methodologies and deciding how to go forward? What is the timeline?
MN: Definitions and methodologies came out of the survey as the top concern. So we started there.

MC: The challenge right now is trying to get break-out group documents into a usable form, and in doing so I’ve already found some gaps that we left out or we need to rethink. This is going to set the tone for the other asset classes and beyond, so I’d rather spend the time now to get the first version right and then watch everything else fall in place instead of trying to rush it out the door. I don’t want to find that we have to redo the whole thing once we start trying to develop a fixed income document.

MN: And we’re also kicking off the Fixed Income TCA stream. With these different subgroups, we’re making sure we get participants who have expertise on the particular subject matter. So there will be some new faces coming in, with Fixed Income expertise, but we will be ensuring we apply the lessons learnt in equities to this space.

MC: Completion of the documentation is very important to us, but perhaps more important will be acceptance and adoption by the financial community. Beyond that, it is our vision that this work will find a permanent home within the standards community so that it can be maintained as needed to serve as the go-to source for TCA standards and best practices.

Analysing TCA & the FIX Protocol Ltd – article from FIXGlobal Magazine


http://fixglobal.com/content/analysing-tca

Analysing TCA

 

Carlos Oliveira, Electronic Trading Solutions at Brandes Investment Partners examines the process of choosing a TCA provider, and the role of FPL.

We use Markit’s Execution Quality Manager (formerly known as QSG) for equity trading TCA. Our decision to switch providers was based on increased algorithm usage, a desire for more functionality, greater execution transparency and most importantly, the availability of more granular data for analysis via FIX.

We FTP our data daily and the results are available to us no later than US market open the next day. Trades are reviewed against traditional and custom benchmarks. We grant access to every trader and risk team member so that they can construct their own views as desired. Typically on a quarterly basis, we conduct our own studies and adapt broker studies to better understand the impact of our orders.

The implementation process
We evaluated four providers before making our final decision. We wanted a flexible platform that would accommodate extensive self-service custom reporting; minimal ongoing maintenance or upgrades requiring internal resources; and flexibility on custom solutions, such as the proper measurement of our ADR creation activity.

One vendor offered a very rich solution that was beyond our needs. For two others, we were not comfortable with the process for submitting data and how much work we would need to do internally. A key determinant was the overall level of commitment to the implementation, which we concluded Markit’s Managing Director Tim Sargent clearly demonstrated. It took us roughly two months to solidify the extract process and we went live on January 1, 2011.

TCA has become a key component of our trading process and we continue to realise value, primarily for post-trade at the moment. The value comes from the constant learning about our orders, what has worked well or not, and the adapting and improving of trading.

The large amount of data to analyse can be overwhelming at first and is easily misinterpreted if one is not careful.

Frequent and honest dialogue with the vendor and the traders, as well as tapping other sources of knowledge (i.e. broker TCA contacts and industry publications), is key to a successful implementation. Many reports went through several iterations, sometimes a quarter or two apart, before we got them to a meaningful and actionable state.

To avoid too one-sided a perspective, we often compare broker-provided TCA reports with our vendor’s. This helps the dialogue with both the brokers and the vendor – it keeps both parties engaged and attentive.

The role of FPL
Our interaction with FPL began with the TCA implementation.

In late 2010, in conferences as well as in industry press, many parties were encouraging the buy-side to gain a better understanding of broker SOR practices and where the orders were getting executed, but with no actionable recommendations outside a specific platform. Being broker-neutral, the FIX execution venue reporting best practices proposed in early 2011 by the FPL Americas Buy-side Working Group helped us to move forward with this goal in the TCA platform. FPL Membership has enabled further contact with other buy-side firms and knowledge sharing not available otherwise to a smaller firm.

We started by asking for Tag 30, LastMarket. Broker responses to the data request varied greatly across brokers and regions. Correspondence spanned many months and contacts, particularly when we asked for MIC codes as opposed to proprietary values. We understand the queue priorities of brokers’ systems and the demands of larger clients, and are very appreciative of what they have done thus far.
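The normalisation being asked for here is mechanically simple once brokers supply the data. A sketch of extracting Tag 30 from a raw FIX message and mapping it to a MIC; the message and the proprietary-code mapping are entirely illustrative, since real broker feeds vary (which is the problem being described):

```python
SOH = "\x01"  # standard FIX field delimiter

# Hypothetical broker-specific venue codes mapped to ISO 10383 MICs
PROPRIETARY_TO_MIC = {"N": "XNYS", "Q": "XNAS"}

def last_market(fix_msg):
    """Extract Tag 30 (LastMarket) from a raw FIX message, normalised to a MIC."""
    fields = dict(f.split("=", 1) for f in fix_msg.strip(SOH).split(SOH))
    raw = fields.get("30")
    return PROPRIETARY_TO_MIC.get(raw, raw)  # pass through values already in MIC form

# An execution report carrying a proprietary venue code:
msg = SOH.join(["8=FIX.4.2", "35=8", "55=IBM", "30=N", "31=192.45"]) + SOH
print(last_market(msg))  # XNYS
```

The hard part is not the code but getting every broker to populate the tag consistently, per region, in the first place.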

Some of our broker relationships have been exceptionally supportive in this effort, leading to enhanced dialogue on routing practices and more meaningful, targeted market structure content calls. Though not perfect, it is a significant improvement from just a year ago.

Ideally we would like to move forward and obtain data for Tag 851, but we are very much aware of the mapping challenges from exchanges to the brokers and to the OMS/EMS systems. We tabled this for 2012, but plan on revisiting it again in 2013.

What is next?
We are currently upgrading our OMS and exploring new functionality. Ultimately we hope for a richer dataset to enhance our capabilities, with some of the current TCA analytics embedded directly in the OMS and as close as possible to a real-time basis.

TCA for FX is also in the works, with a combination of in-house and broker solutions. This is now possible given timestamp collection improvements earlier in the year, but there are challenges still for obtaining data for benchmarking at a reasonable cost.

With greater attention to market structure and its impact on long-term investors, the need for further transparency into how brokers execute orders – so that we can understand our impact and performance – will only continue to grow. For example, it would be great to know the sub-routing/destinations visited prior to getting a fill. The dialogue already in place between brokers, vendors and FPL working groups is a great step towards leveraging FIX for some level of data standardisation in the TCA arena, and we hope it will gain further traction in 2013.

 

Integrating TCA – article from FIXGlobal Magazine


Integrating TCA

 

Huw Gronow, Director, Equities Trading, and Mark Nebelung, Managing Director of Principal Global Investors, make the case that TCA should be part of pre-, during, and post-trade analysis.

Transaction Cost Analysis (TCA) has evolved significantly with the advent of technology in trading, and with it the ability to capture incrementally higher-quality data. Historically the preserve of compliance departments, which examined explicit costs only as a way of governing portfolio turnover, TCA now offers institutional asset managers several opportunities: the ability to quantitatively assess the value of the trading desk, and the tools to form implementation strategies that improve prioritisation, reduce trading costs and therefore improve alpha returns to portfolios.

Cost analysis models, methods and techniques have blossomed in this environment, propagated not only by technological advancements but also by the explosion of data available in modern computerised equity trading.

The benefits of applying cost analysis to the execution function are manifold. It empowers the traders to make informed decisions on strategy choice, risk transfer, urgency of execution and ultimately to manage the optimisation of predicted market impact and opportunity costs.

Although maturing, the TCA industry still has some way to go to fully evolve, largely because of a characteristically dynamic market environment and non-standardised reporting of trades and market data (the so-called “consolidated tape” issue). Moreover, with the rise of ultra-low-latency, high-frequency, short-term-alpha market participants (“HFT”), which now account for the majority of trading activity on US exchanges, the exponential increase in orders withdrawn before execution (with ratios of cancelled to executed trades regularly as high as 75:1) implies an effect on market impact that is as yet unquantified, yet empirically must be real. Finally, fragmentation of equity markets, in both the US and Europe, poses a real and new challenge for true price discovery, and this must also, by extension, be reflected in the post-trade arena.

Nevertheless, waiting for the imperfections and inefficiencies in market data to be ironed out (and they surely will be in time, whether by the industry or by regulatory intervention) means the opportunity to control trading costs is wasted. You cannot manage what you don’t measure. Therefore, with the practitioner’s understanding allied to sound analytical principles, it is straightforward to progress quickly from an anecdotal approach to a more evidence-based process, while avoiding the usual statistical traps of unsound inferences and false positives/negatives.

On the trading desk, the ability to leap forward from being a clerical adjunct of the investment process to presenting empirical evidence of implementation cost control and therefore trading strategy enhancement is presented through this new avalanche of post trade data, which of course then becomes tomorrow’s pre-trade data. The benefit of being able to enrich one’s analysis through a systematic and consistent harvest of one’s own trading data through FIX tags is well documented. The head of trading then arrives at a straight choice: is this data and its analysis solely the preserve of the execution function, or can the investment process, as a whole, benefit from extending its usage? We aim to demonstrate that both execution and portfolio construction functions can reap significant dividends in terms of enhanced performance.

PM Involvement
Portfolio managers’ involvement in transaction cost analysis tends to be a post-trade affair at many firms, on a quarterly or perhaps monthly basis, that inspires about as much excitement as a trip to the dentist. It may be viewed as purely an execution or trading issue, independent of the investment decision-making process. However, there is one key reason why portfolio managers should care about transaction costs: improved portfolio performance. The retort might be that this is the traders’ area of expertise, coupled with a feeling of helplessness about how they could possibly factor transaction costs in. The answer lies in using pre-trade transaction cost estimates to adjust (reduce) your expected alpha signal by some reasonable estimate of implementation costs. You can then make investment decisions based on realisable expected alphas rather than purely theoretical ones.
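The adjustment described above is arithmetically simple; a sketch with purely illustrative numbers (the signals and cost estimates below are not from the article):

```python
def realisable_alpha_bps(expected_alpha_bps, est_cost_bps):
    """Net an implementation cost estimate out of the raw alpha signal."""
    return expected_alpha_bps - est_cost_bps

# Hypothetical signals: the micro-cap ranks best gross but worst net of costs.
signals = {"micro_cap": (120, 90), "large_cap": (100, 15)}
net = {name: realisable_alpha_bps(alpha, cost) for name, (alpha, cost) in signals.items()}
print(net)  # {'micro_cap': 30, 'large_cap': 85}
```

Ranking on the net figures rather than the gross ones is what moves the decision from theoretical to realisable alpha.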

A key characteristic of many investment processes that make use of a quantitative alpha signal is that you always have more stocks (on a stock-count basis) at the small- and micro-cap end of the investable universe. There are simply more stocks that rank well. This is also the part of the universe where liquidity is lowest and implementation shortfall is highest. If you don’t properly penalise the alpha signals with some form of estimated transaction cost, your realised alpha can be more than eroded by the implementation costs.

Proving the Point
To illustrate the impact of including transaction cost estimates in the pre-trade portfolio construction decision making process, consider the following two simulations. Both are based on exactly the same starting portfolio, alpha signals and portfolio construction constraints. The only difference is that in the TCs Reflected simulation, transaction costs were included as a penalty to alpha in the optimisation objective function whereas in the TCs Ignored simulation, pre-trade transaction cost estimates were ignored. The simulations were for a Global Growth strategy using MSCI World Growth as the benchmark, running from January 1999 through the end of June 2012 (13.5 years) with weekly rebalancing. They were based on purely objective (quantitative) alpha signals and portfolio construction (optimisation) with no judgment overlay. Transaction cost estimates were based on ITG’s ACE Neutral transaction cost model. Starting AUM was $150 million. Post-transaction cost returns reflect the impact of the transaction cost estimates for each trade.
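The difference between the two simulations reduces to one term in the optimiser's objective function. A sketch of that idea follows; the linear cost model here is a simple stand-in for a model like ITG's ACE, and every number is illustrative rather than taken from the simulations:

```python
def objective(w_new, w_old, alpha, cov, risk_aversion, cost_bps):
    """Mean-variance utility minus a linear transaction cost penalty."""
    n = len(w_new)
    expected_alpha = sum(a * w for a, w in zip(alpha, w_new))
    risk_penalty = risk_aversion * sum(
        w_new[i] * cov[i][j] * w_new[j] for i in range(n) for j in range(n))
    turnover = sum(abs(a - b) for a, b in zip(w_new, w_old))
    tc_penalty = (cost_bps / 1e4) * turnover  # set cost_bps = 0 for "TCs Ignored"
    return expected_alpha - risk_penalty - tc_penalty

# Toy two-stock example (all parameters hypothetical):
alpha = [0.01, 0.02]
cov = [[0.04, 0.0], [0.0, 0.04]]
w_old, w_new = [0.0, 0.0], [0.5, 0.5]
print(objective(w_new, w_old, alpha, cov, 1.0, 0.0))   # TCs Ignored utility
print(objective(w_new, w_old, alpha, cov, 1.0, 20.0))  # TCs Reflected: lower, so costly trades are disfavoured
```

With the penalty switched on, the optimiser is steered away from high-cost trades, which is why the TCs Reflected portfolio ends up holding a different, net-of-cost superior set of names.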

Despite relatively conservative assumptions about strategy size ($150 million poses relatively few liquidity constraints) and the transaction cost model (ACE Neutral is a relatively passive cost model with lower cost estimates than a more aggressive trading strategy), the portfolio reflecting transaction costs as part of pre-trade portfolio construction outperformed the one where they were ignored by 0.86% per annum. Figure 1 illustrates the cumulative growth of $1 for the two portfolios.

At the end of the time period, the TCs Reflected portfolio had grown to $2.94 vs. $2.63 for the TCs Ignored portfolio, an additional 30% return on initial capital. The turnover of the TCs Reflected portfolio was modestly higher, averaging 69% p.a., compared to 67% p.a. for the TCs Ignored portfolio.

Annualised transaction costs for the TCs Reflected portfolio were slightly higher at 0.64% vs. 0.62% for the TCs Ignored portfolio. Tracking error and volatility of the two portfolios are very similar. The net effect of higher excess returns (after transaction costs) and a similar risk profile (tracking error) was a 34% improvement in the information ratio when transaction costs were reflected as part of the portfolio construction.
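The information ratio arithmetic is worth making explicit: with similar tracking errors, the IR improvement comes almost entirely from the extra excess return. The base excess return and tracking error below are assumptions chosen only to illustrate how a 0.86% p.a. uplift can translate into roughly the quoted 34% IR improvement; the article does not state these inputs:

```python
def information_ratio(excess_return, tracking_error):
    """IR = annualised excess return over benchmark / tracking error."""
    return excess_return / tracking_error

TE = 0.03            # assumed tracking error, taken as equal for both portfolios
BASE_EXCESS = 0.025  # assumed net excess return of the TCs Ignored portfolio

ir_ignored = information_ratio(BASE_EXCESS, TE)
ir_reflected = information_ratio(BASE_EXCESS + 0.0086, TE)  # +0.86% p.a. from the simulation
print(round(ir_reflected / ir_ignored - 1, 2))  # ~0.34, in line with the quoted improvement
```

Because the tracking errors cancel, the IR ratio is just the ratio of excess returns, so the implied base excess return under these assumptions is about 0.86%/0.34 ≈ 2.5% p.a.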

It’s hard to think of many (any?) portfolio managers that wouldn’t seize an opportunity to add an additional 0.86% per annum in excess return. Transaction cost estimates will materially alter the most attractive stocks to add to a portfolio at a given point in time and the cumulative impact on performance is significant. In order to maximise realised portfolio performance, portfolio managers need to reflect some form of implementation cost-adjusted alpha signals such that the expected returns of illiquid stocks are appropriately adjusted for expected costs of buying or selling them in current market conditions.

In addition to portfolio performance improvements, portfolio managers considering pre-trade implementation cost estimates have a better basis to judge whether to reconsider a transaction if current market implementation costs are deviating significantly from the initial estimates. By having a common understanding of implementation costs between both portfolio managers and traders, communication is enhanced pre-, during and post-trade. Where the trading function was previously simply a transaction execution function, it now becomes part of the integrated investment decision making process.

List of TMX ATRIUM TRADING VENUES


TMX Atrium has a wide range of customers, including venues, buy-side firms, brokers, clearers, ISVs and market data vendors.

TMX Atrium covers a wide range of the financial community.

Venue City Country
Alpha Trading Toronto Canada
BATS Europe London UK
BATS US Weehawken USA
BME Madrid Spain
BOX Secaucus USA
CBOE Secaucus USA
CNSX Toronto Canada
Bourse de Luxembourg Luxembourg Luxembourg
Burgundy Stockholm Sweden
CHI-X Canada Toronto Canada
CHI-X Europe Slough UK
CME Chicago USA
Deutsche Boerse Frankfurt Germany
Direct Edge Secaucus USA
Equiduct London UK
FX All Weehawken USA
FXCM Bergen USA
HotSpot Jersey City USA
International Securities Exchange New York USA
LMAX London UK
London Metal Exchange London UK
Match Now Toronto Canada
Montreal Exchange Toronto Canada
Moscow Exchange Moscow Russia
NASDAQ OMX (Nordic) Stockholm Sweden
NASDAQ OMX (US) Carteret USA
Oslo Bors London UK
Nordic Growth Markets Stockholm Sweden
NYSE Euronext (Europe) Basildon UK
NYSE Euronext (US) Mahwah USA
Omega ATS Toronto Canada
Pure Trading Toronto Canada
Sigma-X London UK
TOM Stockholm Sweden
TRAD-X London UK
TSX Toronto Canada
Warsaw Stock Exchange Warsaw Poland

TESS Connect & Go overview including Customers list


http://www.cinnober.com/tess-connect-go

TESS™ Connect & Go is a fully managed, multi-asset marketplace service for trading-intensive banks and brokers, giving instant access to proven cutting-edge exchange technology. The service enables banks and brokers to set up a regulatory-compliant market within weeks. TESS is a Software-as-a-Service (SaaS) solution, suitable for building and provisioning trading venues such as OTFs, SIs, SEFs and dark pools.

A full-service concept

TESS Connect & Go gives banks and brokers a unique opportunity to access state-of-the-art marketplace technology in a full-service concept. For the TESS customer this means a low-risk investment, superior total cost of ownership and the ability to focus fully on core business.

TESS is configured and ready for production within weeks from order. There is no startup cost and the monthly subscription fee covers software, hardware, maintenance, hosting, operations, infrastructure and support. The service is provided from fully redundant ISO-certified data centers globally with 24/7 support and dedicated account management.

Rich and proven marketplace functionality

The core of TESS Connect & Go is the sophisticated multi-asset matching engine used in demanding equities, commodities and derivatives markets with proven speed, performance and robustness.

It can be applied to manage multiple trading models in parallel for liquid as well as illiquid markets including auctions, continuous trading, request for quote (RFQ), dark pool functionality, midpoint matching and OTC trade reporting.

The service can be extended to also include the full-fledged market surveillance system Scila Surveillance. 

Solutions for trading and clearing venues

Product-based solutions that change the world of finance

All Cinnober solutions are based on our TRADExpress™ Platform, built to cater to the needs of high-transaction marketplaces and clearing houses with extreme demands on speed, performance and reliability.

TRADExpress Platform

Includes all the infrastructural components needed for true mission-critical transaction solutions


Managed services

In the financial sector, a strong IT partner needs to deliver more than just robust technology. It should help ensure a smooth launch, implementation and operation – as well as provide a flexible path for post-launch developments, since the market never stops changing.

While some customers might have firmly established system operations, their IT departments might already be fully burdened and unable to take on new projects. New marketplaces may start out without an IT department at all, and with very few resources in place. Cinnober therefore offers complete system hosting and operational services, from system dimensioning, through installation, to ongoing operation.

All Cinnober systems can be ordered as fully managed solutions, including hosting in ISO-certified data centers, management of infrastructure and hardware, and system operations covering monitoring, surveillance, backup, system reports and issue management. All clients are backed up by a dedicated Technical Account Manager and the Cinnober Service Desk.

Customers

To write, or not to write? – The dilemma for ISVs and their role in the success of a new trading platform


http://www.fow.com/Article/3154709/Themes/26528/To-write-or-not-to-write.html

To write, or not to write?

12 February 2013

Nasdaq’s new trading platform NLX is gearing up for launch in London. Sentiment is shifting in favour of the prospects for the MTF. This highlights the dilemma for ISVs and their role in the success of a new trading platform, argues William Mitting.

Six months ago you would have struggled to find anyone in London who thought that NLX, the new London-based exchange from Nasdaq OMX, would succeed.

The prevailing wisdom was that the plan to launch six interest rate contracts replicating the most liquid on Liffe and Eurex was too simplistic, the margin efficiencies too intangible, and the distraction of regulation and rising costs elsewhere too great to guarantee the involvement of the banks and prop trading firms needed for a successful launch.

Today all the talk in London is of NLX. After six months of painstaking road-showing and collaboration with local participants by Charlotte Crosswell and her team at NLX there is a real buzz about the launch around the City.

Much of that buzz is coming from the proprietary trading houses. Attracted by the lower fees, the lower participation from HFTs expected on the platform and the belief that the banks, who are expected to benefit from the margin efficiency enabled by portfolio margining across the yield curve, will provide liquidity, London’s largest prop houses are increasingly talking up the prospects of the MTF.

This rapid shift in sentiment poses a challenge for those ISVs who made the call not to write to the MTF for its launch. The highest profile among those ISVs is Trading Technologies, which is not expected to be ready for the launch of NLX.

FOW understands that TT has been in negotiations with NLX over writing to it but has not yet reached agreement on how that will be funded. TT and NLX declined to comment on any negotiations. Initially this was widely viewed as a significant blow to NLX’s chances, but as the buzz around the platform grows, some are asking if TT has made a mistake.

Jeff Mezger, TT's head of market connectivity, told FOW that TT had "not ruled out connecting to NLX" at its launch and saw the benefits of margin efficiency, but was currently focused on in-flight projects such as the connection to Eris Exchange and its beta-stage MultiBroker platform.

“We take into account a number of factors when prioritising the projects we work on. For exchange projects we take into account exchange location, familiarity with the exchange platform, connectivity costs, client interest, exchange volume, asset classes, products traded and the changing regulatory environment.

“We also take into account what other projects we have in flight and the availability of resources to work on the project.”

This dilemma of whether to connect to a new trading platform is a relatively new phenomenon in derivatives markets but will become more of an issue as new platforms launch in the wake of industry efforts and new regulations aimed at opening up competition.

All ISVs have limited manpower and resources to write to new platforms and the decision of whether to do so is often made with little visibility as to whether that platform will succeed.

Usually one or a number of customers will help to fund the connection, sometimes the platform or exchange will pay the majority of the cost and for “dead certs” the ISV will fund it in the knowledge that it will see a return on investment. However, what constitutes a dead cert is becoming less clear as markets proliferate.

Steve Grob, director of group strategy at Fidessa, said: “The whole dynamic of ISVs connecting to venues has changed since Mifid was introduced back in 2007. Volumes that were concentrated in two or three exchanges were spread over multiple platforms.”

This altered the economics of connectivity as it meant that brokers were having to spread the same volumes over multiple venues and this inevitably led to downward pressure on gateway pricing. At the same time, the number of platforms launched with uncertain prospects is increasing.

When Liffe launched its Connect platform at the turn of the century, many ISVs wrote to it and made a decent return doing so. As the derivatives market becomes more fragmented thanks to Dodd-Frank and EMIR, it is harder to predict which platforms are worth the investment.

“The challenge is picking the markets that have the best chance of success,” says Steve Woodyatt, chief executive of Object Trading, which will connect to NLX on day one.

Hamish Purdey, the chief executive of Ffastfill, which is also going live from day one, agrees: “It takes significant commercial judgement. Any ISV has competing priorities and the challenge is finding the ones with the greatest return on investment.”

For fledgling exchanges, a major ISV writing to it can provide a significant boost and it is not unheard of for exchanges to pay large sums to global ISVs to write to them. However, this is rare and in most instances ISVs must make a call on the commercial benefits of connecting to a new exchange.

Because the cost of switching to a new provider can be high, the customers of a large ISV that does not write to an exchange are often left with no options unless they are willing to fund the connection themselves, and the decision not to connect can be contentious.

However, as competition among connectivity providers grows and software-as-a-service operations make switching provider less arduous, a shift in the market's perception of the need to connect to an exchange can potentially wrong-foot ISVs.

RTS has announced publicly that it will write to NLX from launch, and FOW understands that Fidessa, SunGard, Object Trading, Stellar and Orc are also among the ISVs providing day-one access.

TT and Marex’s STS are among the notable absences from day one trading (although FOW understands STS will be up and running shortly afterwards) but there is still some distance to run and TT could still commit before the launch date, which is expected for early Q2. However, the NLX example has brought to the fore a very modern dilemma for ISVs in the derivatives business.

Panopticon releases version 6.1.1, great news for finance TCA.



http://www.automatedtrader.net/news/complex-event-processing-news/141992/panopticon-releases-version-611

Stockholm, Sweden – Panopticon Software, the provider of visual data analysis software for real-time, CEP and historical time series data, has released Panopticon 6.1.1, the latest version of its data visualization software suite. As with previous releases, Panopticon 6.1.1 supports Windows and Java IT environments for enterprise deployment and also allows clients to embed visual analysis-enabled Panopticon dashboards into their own applications.

Panopticon can visualize data from true real-time streaming sources, including message queues, CEP engines, real-time databases and tick history stores, and can federate data from multiple sources, including standard relational databases, OData producers, other web services and flat files.
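Panopticon's own APIs are not shown here, but the data-federation idea the paragraph above describes — joining a relational source with a flat file before visualizing — can be sketched in plain Python. The table name, CSV contents and join key below are invented purely for illustration:

```python
import csv
import io
import sqlite3

# Relational source: an in-memory SQLite table of trades (illustrative data).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
db.executemany("INSERT INTO trades VALUES (?, ?)", [("ABC", 100), ("XYZ", 250)])

# Flat-file source: reference prices in CSV form.
prices_csv = "symbol,price\nABC,10.5\nXYZ,42.0\n"
prices = {row["symbol"]: float(row["price"])
          for row in csv.DictReader(io.StringIO(prices_csv))}

# Federate: join the two sources on their common key, producing the
# combined records a dashboard would then visualize.
federated = [
    {"symbol": sym, "qty": qty, "notional": qty * prices[sym]}
    for sym, qty in db.execute("SELECT symbol, qty FROM trades")
]
print(federated)
# → [{'symbol': 'ABC', 'qty': 100, 'notional': 1050.0},
#    {'symbol': 'XYZ', 'qty': 250, 'notional': 10500.0}]
```

In a real deployment the joined result would feed a visualization layer rather than a print statement; the point is only that "federation" means joining heterogeneous sources on shared keys before display.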

Panopticon 6.1.1 features include:

  • New Timeseries Scatter Plot Visualization
  • New data connector for the 60 East AMPS message bus
  • Resampling of tick data sources through zooming
  • Enhanced Heat Matrix tile to support multiple display fields
  • New Time Series Product calculations
  • Support for Apache ActiveMQ message buses, OneMarketData OneTick tick databases and Kx kdb+tick CEP engines in Java environments.

Panopticon 6.1.1 supports the following deployment options:

  • Enterprise: The Panopticon 6 platform includes a desktop authoring tool that allows power users to assemble and publish new monitoring and analysis dashboards to the web.
  • Tablet: The same dashboards delivered to desktop computers are available on tablets. Panopticon’s HTML5 user interface is specifically designed to take advantage of the touch capabilities of the iPad and Android devices.
  • Embed: The Panopticon Embed tool allows users to build dashboards quickly using Panopticon’s point-and-click software and then embed the complete dashboards as objects in their own applications, enabling the completion of development projects with minimal coding.

Peter Simpson, SVP Research & Development for Panopticon, stated, “The enhancements in 6.1.1 include several that were developed in close cooperation with our major clients. They further enhance our capabilities in mid-office market/credit risk and P&L applications as well as front-office TCA, including real-time TCA, end-of-day analysis and review of historic transaction data.”

OTC Clearing and Regulations

My views on OTCs and Regulations
