Continuum Analytics Provides Python Insight into Thomson Reuters Databases

Continuum Analytics is providing access to Thomson Reuters’ QADirect data offering via Wakari, its browser-based Python data analytics platform. Python is an open source computer language that was first released in 1991, and which has grown in popularity in part because of its emphasis on code readability and compactness. A number of financial markets participants are known to be using or evaluating it.

QADirect comprises data spread across a number of databases, including Worldscope, Datastream and I/B/E/S, and is focused on the needs of quantitative research. It provides access to normalised data from different sources via a single identifier. It is typically deployed on a customer site with daily updates, and is hosted in a relational database, such as from Microsoft or Oracle. Access is via SQL or an analytics package, including SAS, S-Plus, Matlab or R, or via Thomson Reuters’ own QA Studio analytics interface.

Austin, Texas-based Continuum specialises in Python-related technology for data management and analytics. Founded in 2011, Continuum is privately backed and earlier this year received $3 million of funding from the Defense Advanced Research Projects Agency (DARPA) to develop visualisation techniques for large, multi-dimensional datasets. It released Wakari into beta in December 2012, providing an easy mechanism for developers to create data analytics using Python. Wakari includes IOPro, a library developed specifically to optimise access to large datasets, such as QADirect.

Continuum president Peter Wang says Python provides a simpler data access method than traditional SQL, reckoning that Python can do in one line of code what takes SQL five or six to achieve. It is also an easier programming language to read than the likes of C, C++ and Java, he contends.
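To give a flavour of the brevity Wang describes (the example and data are invented, not Continuum’s code): averaging closing prices per sector takes a JOIN, a GROUP BY and an aggregate in SQL, while in plain Python it reduces to a short loop and a comprehension, and to a single expression with a library such as pandas.

```python
# Hypothetical illustration only. The SQL equivalent might read:
#
#   SELECT s.sector, AVG(p.close)
#   FROM prices p JOIN securities s ON p.sec_id = s.sec_id
#   GROUP BY s.sector;
from collections import defaultdict

prices = [(1, 10.0), (1, 12.0), (2, 20.0)]   # (sec_id, close)
sectors = {1: "Tech", 2: "Energy"}           # sec_id -> sector name

by_sector = defaultdict(list)
for sec_id, close in prices:
    by_sector[sectors[sec_id]].append(close)

avg_by_sector = {sector: sum(c) / len(c) for sector, c in by_sector.items()}
print(avg_by_sector)  # {'Tech': 11.0, 'Energy': 20.0}
```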

Wakari can either be installed within an enterprise or accessed as a managed service, in which case it relies on cloud EC2 (compute) and S3 (storage) infrastructure from Amazon Web Services. It can also leverage GPU-accelerated compute nodes. One of the strengths of Wakari, says Wang, is the ease of sharing analytics results, which can be achieved by sharing a single URL.

Python has clearly caught the interest of financial markets developers. More than 300 attended a recent New York Python in Finance conference, hosted by Bank of America Merrill Lynch, which featured speakers from Continuum, and addressed topics including visualisation, scalability, GPU support, and integration with the likes of R and Microsoft Excel.

Standardising TCA - article from FIXGlobal Magazine

Mike Caffi, VP and Manager of Global TCA Services, State Street Global Advisors, and Mike Napper, Director and Head of Global Client Analytics Technology, Credit Suisse and EMEA Client Connectivity Technology, Co-Chairs of the FPL TCA Working Group, examine the motivations for, and the progress of the TCA reference guide.

What is the history of the FPL TCA project?
Mike Caffi: The industry has been lacking in any kind of standards for TCA, and that’s not a new problem. I have seen some really good TCA white papers over the last five to 10 years that have tried to address the subject, but eventually these fade on the shelf because there is no follow-up support or interest. What we’ve always needed was some independent group to be able to take ownership of this, but how does one get that started? It never really came about until about a year and a half ago, in September of 2011, with the formation of the OpenTCA group. The group was a collaboration of four sell-side firms in London and EMS and TCA vendor TradingScreen, which was the glue that put them all together.

When I read their white paper I got really excited because I saw a group of individuals who were trying to promote at least what appeared to be an essence of a global standard. They held a conference in London, and then they came to Boston and I contacted the person who was heading the public relations at TradingScreen. They invited me to be on a panel to talk about the benefits of standards, and that was in November of 2011. At this meeting TradingScreen had really tried to move this along, but I saw the need for a larger group to really take on this challenge as well. So that’s where I felt a group like FPL would be perfect.

As a matter of fact, when I was at the conference, I related this back to the early beginnings of the FIX Protocol when I was involved 15 years ago. We needed the collaboration of industry participants within a neutral body such as FPL, to take ownership of TCA standards. I even posed that question to the audience: “who would be willing to form a group, putting up a small amount of money just to get the essence of a working group together?”, but there really wasn’t much reaction at that particular time. Given that the holidays were approaching I decided to let it simmer down until after the New Year.

By early March I contacted John Goeller, FPL Americas Regional Co-Chair, and ran the idea by him to see if FPL would be interested in hosting a TCA Working Group. John liked the idea and soon after ran it by the organisation’s Global Steering Committee, who also thought it would deliver strong industry benefit. From there, FPL leaders opened discussions with TradingScreen and, given strong FPL member firm interest in addressing some of the key business issues impacting the TCA environment, it was agreed that FPL would create a TCA working group. Representatives from TradingScreen joined this parallel activity.

So it was agreed to put out a call for participation, and on the first pass we had about 70 people sign up. We had our first meeting in June of last year, and that is when we got traction; at that point it was really pretty much driven by consensus, which evolved into a survey that allowed us to prioritise our objectives. That gave us greater focus and direction on what to do and, from that survey, we could see that with 70 or 80 people, we needed to break out into smaller groups.

The number one issue highlighted in the survey was terminology and methodology, and as such it was decided that our first working group should focus on coming up with standardised definitions for TCA in the equity space. This has taken some time as we wanted to take a slightly different approach, not writing a white paper, but more of a working reference guide. That project has been our focus for the last four or five months and right now we’re at a point where individuals are actually finalising the more difficult aspects of that document.

Last September, I said it’d be great if we can have this all done by the end of the year, and that was just a bit presumptuous on my part. Now I realise this is going to take quite a while to produce as there is a lot of work involved, and this is just the equity space. We’re going to look at multi-asset class perspectives of TCA after that.

Mike Napper, would you like to give a brief overview of the reasons behind your involvement?
Mike Napper: I’m interested and involved in this initiative from two perspectives. Firstly, I head Credit Suisse’s Global Transaction Cost Analysis Technology, both pre-trade and post-trade. Secondly, I also head FIX Client Connectivity for Credit Suisse in EMEA, for Equities, FX and Listed Derivatives, and thus have exposure to the FIX Protocol and FPL.

In 2Q2012, I was invited to help lead this initiative as the sell-side co-chair and I was very happy to contribute. The standardisation will help everyone in the market. It will help clients by providing more clarity on the reports they’re reading. It will help brokers and third party vendors by providing a consolidated reference guide explaining the principles and methodologies to all stakeholders. Firms are doing a lot of creative and original technical analysis, but there isn’t consensus in all cases on some of the basic stuff, and that’s an opportunity.

An area of particular interest, with both my TCA and FIX hats, is the convergence of asset classes onto electronic trading over time, providing greater automation and transparency. We can agree some foundational definitions for what TCA means in a multi-asset-class sense. We have started with a set of Equities definitions, to clarify and standardise what’s already out there and in most cases mature. Then, we’ll expand and mature the definitions across asset classes, where in some cases there is less existing consensus.

How are you unifying the different definitions that each market and regulator has, on areas such as the consolidated tape?
MN: The consolidated tape is an area where Europe has lagged. MiFID created the market fragmentation: this has been good for the industry’s efficiency and has promoted competition and innovation. Yet it left us with challenges on the market data side, notably with respect to TCA. We all spend time and effort re-assembling what we think is the best representation of a consolidated tape, from fragmented data, and using our own views of which venues and liquidity types to include. These varied approaches can cause confusion. We are not in this TCA process to solve the lack of a consolidated EMEA tape, other groups are working on that, but we can provide some guidance and standardisation on how best to handle the fragmentation.

MC: From the overall results of the survey, the consolidated tape was at the very bottom of the list. I think it’s not that it was the least concern on the survey, but rather the one over which we’d have the least control. From those discussions, we have found that being within the FPL group has had its benefits, because we’ve now been able to liaise with other teams that are working more closely on those types of projects, such as the EMEA FPL Trade Data Standardisation Working Group.

Computing VWAP is one area within the European consolidated tape where we can provide some guidance. That’s where standardisation of TCA can help. And, as Mike said, there is the question of which liquidity sources should be included or excluded. That’s where standards get tricky, as you may need to structure VWAP in a certain way for your own internal reporting, so this becomes guidance rather than an absolute. Part of what I also see the reference guide providing is guidance on when it is best to use VWAP, when it is best to use a point-in-time reference such as an arrival price or another type of benchmark, and what the results you get actually mean.
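As a concrete illustration of the definitional point, a minimal VWAP calculation might look like the sketch below. The trade prints are invented; the real standardisation question the working group describes is precisely which venues and liquidity types are allowed to contribute prints to this list.

```python
def vwap(trades):
    """Volume-weighted average price over a list of (price, size) prints."""
    total_value = sum(price * size for price, size in trades)
    total_size = sum(size for _, size in trades)
    return total_value / total_size if total_size else None

# Three hypothetical prints: sum(price*size) = 100020, sum(size) = 1000
trades = [(100.00, 500), (100.10, 300), (99.95, 200)]
print(round(vwap(trades), 2))  # 100.02
```

Which prints go into `trades` (lit only? dark? auctions?) changes the answer, which is why the guide treats this as guidance rather than an absolute.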

I think we can help educate the recipients of TCA to better understand what it is they are receiving, or to know when to ask questions. The OpenTCA paper provided a number of excellent exhibits to help guide the selection of a provider and/or interpretation of 3rd party TCA. There are many approaches to TCA and the results can be misleading if you are not aware of the methodology that is being used.

What comes next, once it is finalised?
MC: My intent is to make this a living document that will adapt as the markets evolve over time; I don’t want it to be one of those things that gets put on the shelf and forgotten.

MN: We hope to think of it as building something like the FIX Protocol; it will continue to be refined by the participants and the users of it through consensus and the governance presented by the FPL process.

MC: I think there’s another aspect that came out of the survey results, and this is where my Co-chair has a lot of strengths: there is the TCA perspective, but there’s also going to be some overflow into the FIX Protocol space. I think that’s where Mike Napper has much more expertise than I do, because it’s part of his day-to-day work and he understands the Protocol in terms of what we need to adjust. There have been some ideas for potential enhancements, with the introduction of further standardised tags to the FIX Protocol to make TCA more robust within the FIX realm. Mike Napper will lead this work stream and look into the FIX-related aspects of TCA, once we get beyond the basics.

MN: What tends to happen is that many instructional fields from clients to brokers may get defined on a custom per-broker/per-vendor basis. Thus we have a wide range of custom tags in operation. If we can get some consensus around, say, the 10 key pieces of information that you need to capture around each order, to provide a standard and robust TCA computation, then we can look to embed those as standard tags in the FIX Protocol. We’re still working on finalising the consensus around core TCA methodology and definitions, so we’re not yet looking at the implementation of FIX tags… but that will come.

Is that where you’d say your biggest challenge to date has been, with the definitions, the methodologies and deciding how to go forward? What is the timeline?
MN: Definitions and methodologies came out of the survey as the top concern. So we started there.

MC: The challenge right now is trying to get break-out group documents into a usable form, and in doing so I’ve already found some gaps that we left out or we need to rethink. This is going to set the tone for the other asset classes and beyond, so I’d rather spend the time now to get the first version right and then watch everything else fall in place instead of trying to rush it out the door. I don’t want to find that we have to redo the whole thing once we start trying to develop a fixed income document.

MN: And we’re also kicking off the Fixed Income TCA stream. With these different subgroups, we’re making sure we get participants who have expertise on the particular subject matter. So there will be some new faces coming in, with Fixed Income expertise, but we will be ensuring we apply the lessons learnt in equities to this space.

MC: Completion of the documentation is very important to us, but perhaps more important will be acceptance and adoption by the financial community. Beyond that, it is our vision that this work will find a permanent home within the standards community so that it can be maintained as needed to serve as the go-to source for TCA standards and best practices.

Analysing TCA & the FIX Protocol Ltd - article from FIXGlobal Magazine

Analysing TCA


Carlos Oliveira, Electronic Trading Solutions at Brandes Investment Partners examines the process of choosing a TCA provider, and the role of FPL.

We use Markit’s Execution Quality Manager (formerly known as QSG) for equity trading TCA. Our decision to switch providers was based on increased algorithm usage, a desire for more functionality, greater execution transparency and most importantly, the availability of more granular data for analysis via FIX.

We FTP our data daily and the results are available to us no later than US market open the next day. Trades are reviewed against traditional and custom benchmarks. We grant access to every trader and member of the risk team, so that they can construct their own views as desired. Typically on a quarterly basis, we conduct our own studies and adapt broker studies to better understand the impact of our orders.

The implementation process
We evaluated four providers before making our final decision. We wanted a flexible platform that would accommodate self-service custom reporting needs; minimal ongoing maintenance or upgrades requiring internal resources; and flexibility on custom solutions, such as the proper measurement of our ADR creation activity.

One vendor offered a very rich solution that was beyond our needs. For two others, we were not comfortable with the process for submitting data and how much work we would need to do internally. A key determinant was the overall level of commitment to the implementation, which we concluded Markit’s Managing Director Tim Sargent clearly demonstrated. It took us roughly two months to solidify the extract process and we went live on January 1, 2011.

TCA has become a key component of our trading process and we continue to realise value, primarily for post-trade at the moment. The value comes from the constant learning about our orders, what has worked well or not, and the adapting and improving of trading.

The large amount of data to analyse can be overwhelming at first, and easily misinterpreted if one is not careful.

Frequent and honest dialogue with the vendor and the traders, as well as tapping other sources of knowledge (i.e. broker TCA contacts and industry publications), is key to a successful implementation. Many reports went through several iterations, sometimes a quarter or two apart, before we got them to a meaningful and actionable state.

To avoid having too one-sided a perspective, we often compare broker-provided TCA reports with our vendor’s. This helps the dialogue with both the brokers and the vendor – it keeps both parties engaged and attentive.

The role of FPL
Our interaction with FPL began with the TCA implementation.

In late 2010, in conferences as well as in industry press, many parties were encouraging the buy-side to gain a better understanding of broker SOR practices and where the orders were getting executed, but with no actionable recommendations outside a specific platform. Being broker-neutral, the FIX execution venue reporting best practices proposed in early 2011 by the FPL Americas Buy-side Working Group helped us to move forward with this goal in the TCA platform. FPL Membership has enabled further contact with other buy-side firms and knowledge sharing not available otherwise to a smaller firm.

We started by asking for Tag 30, LastMarket. Broker responses to the data request varied greatly across brokers and regions. Correspondence spanned many months and contacts, particularly when we asked for MIC codes as opposed to proprietary values. We understand the queue priorities of brokers’ systems and the demands of larger clients, and are very appreciative of what they have done thus far.
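For readers less familiar with FIX, the venue data discussed here arrives as tag=value pairs inside execution reports; Tag 30 carries the market of execution, ideally as a MIC code. A toy parser (message content invented, and the SOH field delimiter shown as "|" for readability) might look like:

```python
def parse_fix(msg, sep="|"):
    """Split a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in msg.strip(sep).split(sep) if field)

# Invented execution report fragment: 35=8 (ExecutionReport),
# 31/32 = last price/quantity, 30 = LastMarket as a MIC code.
raw = "8=FIX.4.2|35=8|55=IBM|31=192.45|32=100|30=XNYS|"
fields = parse_fix(raw)
print(fields["30"])  # XNYS
```

A buy-side firm collecting Tag 30 this way can aggregate fills by venue, which is what makes MIC codes, rather than proprietary values, so valuable for TCA.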

Some of our broker relationships have been exceptionally supportive in this effort, leading to enhanced dialogue on routing practices and more meaningful, targeted market structure content calls. Though not perfect, it is a significant improvement from just a year ago.

Ideally we would like to move forward and obtain data for Tag 851, but we are very much aware of the mapping challenges from exchanges to the brokers and to the OMS/EMS systems. We tabled this for 2012, but plan on revisiting it again in 2013.

What is next?
We are currently upgrading our OMS and exploring new functionality. Ultimately we hope for a richer dataset to enhance our capabilities, with some of the current TCA analytics embedded directly in the OMS and as close as possible to a real-time basis.

TCA for FX is also in the works, with a combination of in-house and broker solutions. This is now possible given timestamp collection improvements earlier in the year, but there are challenges still for obtaining data for benchmarking at a reasonable cost.

With greater attention to market structure and its impact on long-term investors, the need for further transparency into the execution of orders by brokers, so as to understand our impact and performance, will only continue to grow. For example, it would be great to know the sub-routing/destinations visited prior to getting a fill. The dialogue already in place between brokers, vendors and FPL working groups is a great step towards leveraging FIX for some level of data standardisation in the TCA arena that we hope will gain further traction in 2013.


Integrating TCA - article from FIXGlobal Magazine

Integrating TCA


Huw Gronow, Director, Equities Trading, and Mark Nebelung, Managing Director of Principal Global Investors, make the case that TCA should be part of pre-, during, and post-trade analysis.

Transaction Cost Analysis (TCA) has evolved significantly with the advent of technology in trading, and with it the ability to capture incrementally higher quality data. Historically the preserve of compliance departments, which examined explicit costs only as a way of governing portfolio turnover, TCA now provides institutional asset managers with several opportunities: the ability to quantitatively assess the value of the trading desk, and the tools to form implementation strategies that improve prioritisation, reduce trading costs and therefore improve alpha returns to portfolios.

Cost analysis models, methods and techniques have blossomed in this environment, propagated not only by technological advancement, but also by the explosion of data available in modern computerised equity trading.

The benefits of applying cost analysis to the execution function are manifold. It empowers the traders to make informed decisions on strategy choice, risk transfer, urgency of execution and ultimately to manage the optimisation of predicted market impact and opportunity costs.

Although maturing, the TCA industry still has some way to go to fully evolve, largely because of a characteristically dynamic market environment and non-standardised reporting of trades and market data (the so-called “consolidated tape” issue). Moreover, with the advent and growth of ultra-low latency, high-frequency, short-term-alpha market participants (“HFT”), which now account for the majority of trading activity on US exchanges, the exponential increase in orders withdrawn before execution (with ratios of cancelled to executed trades regularly as high as 75:1) implies an effect on market impact that is as yet unquantified, yet empirically must be real. Finally, fragmentation of equity markets, in both the US and Europe, presents a real and new challenge to true price discovery, and this must also, by extension, be reflected in the post-trade arena.

Nevertheless, waiting for the imperfections and inefficiencies in market data to be ironed out (and they surely will be in time, whether by the industry or by regulatory intervention) means the opportunity to control trading costs is wasted. You cannot manage what you don’t measure. Therefore, with the practitioner’s understanding allied to sound analytical principles, it is straightforward to progress quickly from an anecdotal approach to a more evidence-based process, while avoiding the usual statistical traps of unsound inferences and false positives/negatives.

On the trading desk, the opportunity to leap from being a clerical adjunct of the investment process to presenting empirical evidence of implementation cost control, and therefore trading strategy enhancement, comes from this new avalanche of post-trade data, which of course then becomes tomorrow’s pre-trade data. The benefit of enriching one’s analysis through a systematic and consistent harvest of one’s own trading data through FIX tags is well documented. The head of trading then faces a straight choice: is this data and its analysis solely the preserve of the execution function, or can the investment process as a whole benefit from extending its usage? We aim to demonstrate that both execution and portfolio construction functions can reap significant dividends in terms of enhanced performance.

PM Involvement
Portfolio managers’ involvement in transaction cost analysis tends to be a post-trade affair at many firms, on a quarterly or perhaps monthly basis, that inspires about as much excitement as a trip to the dentist. It may be viewed as purely an execution or trading issue, independent of the investment decision-making process. However, there is one key reason why portfolio managers should care about transaction costs: improved portfolio performance. The retort might be that this is the traders’ area of expertise, coupled with a feeling of helplessness about how they could possibly factor transaction costs in. The answer lies in using pre-trade transaction cost estimates to adjust (reduce) your expected alpha signal by some reasonable estimate of implementation costs. You can then make investment decisions based on realisable expected alphas rather than purely theoretical ones.
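A minimal sketch of that adjustment, with invented numbers rather than any real cost model: subtract an estimated implementation cost from each stock’s expected alpha before ranking.

```python
# Invented annualised figures for two hypothetical names.
expected_alpha = {"SMALLCAP": 0.030, "LARGECAP": 0.012}
est_cost = {"SMALLCAP": 0.025, "LARGECAP": 0.002}  # estimated shortfall

# Realisable alpha = expected alpha minus estimated implementation cost.
realisable = {s: expected_alpha[s] - est_cost[s] for s in expected_alpha}
print({s: round(a, 4) for s, a in realisable.items()})
# {'SMALLCAP': 0.005, 'LARGECAP': 0.01}
```

On raw alpha the illiquid small-cap name looks twice as attractive; after the cost adjustment the ranking reverses, which is exactly the effect the authors describe in the small- and micro-cap end of the universe.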

A key characteristic of many investment processes that make some use of a quantitative alpha signal is that you always have more stocks (on a stock count basis) at the small- and micro-cap end of the investable universe. There are simply more stocks that rank well. This is also the part of the universe where liquidity is lowest and implementation shortfall is highest. If you don’t properly penalise the alpha signals with some form of estimated transaction cost, your realised alpha can be more than eroded by the implementation costs.

Proving the Point
To illustrate the impact of including transaction cost estimates in the pre-trade portfolio construction decision making process, consider the following two simulations. Both are based on exactly the same starting portfolio, alpha signals and portfolio construction constraints. The only difference is that in the TCs Reflected simulation, transaction costs were included as a penalty to alpha in the optimisation objective function whereas in the TCs Ignored simulation, pre-trade transaction cost estimates were ignored. The simulations were for a Global Growth strategy using MSCI World Growth as the benchmark, running from January 1999 through the end of June 2012 (13.5 years) with weekly rebalancing. They were based on purely objective (quantitative) alpha signals and portfolio construction (optimisation) with no judgment overlay. Transaction cost estimates were based on ITG’s ACE Neutral transaction cost model. Starting AUM was $150 million. Post-transaction cost returns reflect the impact of the transaction cost estimates for each trade.
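Schematically, the difference between the two simulations is whether the objective function subtracts a cost penalty. The toy decision rule below (invented numbers, not ITG’s ACE model or the authors’ optimiser) shows how a marginally attractive trade is accepted when costs are ignored and rejected when they are reflected.

```python
def utility(alpha, trade_size, cost_per_unit, reflect_costs):
    """Expected gross alpha of a trade, optionally net of an estimated cost penalty."""
    gross = alpha * trade_size
    penalty = cost_per_unit * abs(trade_size) if reflect_costs else 0.0
    return gross - penalty

# A marginal trade: positive gross alpha, but the estimated cost exceeds it.
print(utility(0.001, 10_000, 0.002, reflect_costs=False) > 0)  # True: trade accepted
print(utility(0.001, 10_000, 0.002, reflect_costs=True) > 0)   # False: trade rejected
```

Repeated over 13.5 years of weekly rebalances, the accumulation of such marginal decisions is what drives the performance gap reported in the article.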

Despite relatively conservative assumptions about strategy size ($150 million poses relatively few liquidity constraints) and transaction cost model (ACE Neutral is a relatively passive cost model, with lower cost estimates than a more aggressive trading strategy), the portfolio reflecting transaction costs in pre-trade portfolio construction outperformed the one that ignored them by 0.86% per annum. Figure 1 illustrates the cumulative growth of $1 between the two portfolios.

At the end of the time period, the TCs Reflected portfolio had grown to $2.94 vs. $2.63 for the TCs Ignored portfolio, an additional 30% return on initial capital. The turnover of the TCs Reflected portfolio was modestly higher, averaging 69% p.a., compared to 67% p.a. for the TCs Ignored portfolio.

Annualised transaction costs for the TCs Reflected portfolio were slightly higher, at 0.64% vs. 0.62% for the TCs Ignored portfolio. Tracking error and volatility of the two portfolios are very similar. The net effect of higher excess returns (after transaction costs) and a similar risk profile (tracking error) was a 34% improvement in the information ratio when transaction costs were reflected in portfolio construction.
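The arithmetic behind the quoted 34% figure can be sketched as follows. The tracking error and base excess return are assumed values chosen only to show the mechanics (the article states only that the risk profiles were similar); with similar tracking errors, the IR improvement reduces to the ratio of excess returns.

```python
def information_ratio(excess_return, tracking_error):
    """IR = annualised excess return divided by tracking error."""
    return excess_return / tracking_error

te = 0.04                # assumed 4% tracking error for both portfolios
base_excess = 0.0250     # assumed base excess return for the TCs Ignored portfolio

ir_ignored = information_ratio(base_excess, te)
ir_reflected = information_ratio(base_excess + 0.0086, te)  # +0.86% p.a. from the article

print(round(ir_reflected / ir_ignored - 1, 3))  # relative IR improvement, about 34%
```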

It’s hard to think of many (any?) portfolio managers that wouldn’t seize an opportunity to add an additional 0.86% per annum in excess return. Transaction cost estimates will materially alter the most attractive stocks to add to a portfolio at a given point in time and the cumulative impact on performance is significant. In order to maximise realised portfolio performance, portfolio managers need to reflect some form of implementation cost-adjusted alpha signals such that the expected returns of illiquid stocks are appropriately adjusted for expected costs of buying or selling them in current market conditions.

In addition to portfolio performance improvements, portfolio managers considering pre-trade implementation cost estimates have a better basis to judge whether to reconsider a transaction if current market implementation costs are deviating significantly from the initial estimates. By having a common understanding of implementation costs between both portfolio managers and traders, communication is enhanced pre-, during and post-trade. Where the trading function was previously simply a transaction execution function, it now becomes part of the integrated investment decision making process.


MTS offers 10 years of historical fixed income data from 17 Countries

First Published Monday, 15th April 2013 from Automated Trader : Automated Trading News

MTS said it is making extensive data from 17 countries available, covering a turbulent period in financial markets



London – MTS, the European fixed income trading venue, said it is making 10 years of historical debt data available.

Starting in 2003 and covering events such as the sub-prime failure, the collapse of Lehman Brothers and subsequent Eurozone turmoil, MTS said the data offers a source of historical information for back-testing and building trading strategies.

MTS said it has experienced a significant uptick in demand for historical data. The data comes from the MTS interdealer market, providing executable quotes, orders and traded prices rather than indicative quotes.

The data covers 17 of the MTS Cash interdealer government bond markets with over 100 participants trading more than 1,100 bonds, all generating around 30,000,000 executable prices each day.

Subscribers are a mix of investment banks, institutional asset managers, pension funds, hedge funds, academics, central banks and regulators.

Simon Linwood, Data Manager at MTS, said: “The fixed income market today is unrecognisable compared to a decade ago, and MTS has certainly played a key role in the growth and innovation of this market. We have continually expanded the MTS network across Europe and we now facilitate 17 electronic interdealer markets. Our historical data offering covers all bond types listed across these markets including treasury bills, zero coupon bonds, FRNs and linkers, providing an unparalleled level of insight into the market’s reaction to significant events during the last decade.”

Dr. Alfonso Dufour, Director of the PhD Programme, ICMA Centre said: “MTS Historical Data, with 10 years of tick data on euro-area government bonds, is an indispensable resource for researchers, practitioners and policy makers to understand the dynamics of the euro yield curve and sovereign bond liquidity”.

MTS said it offers a range of datasets with increasing granularity to meet customers’ specific needs: from daily trading summaries providing aggregated end-of-day trading statistics, to trade-by-trade files, to high frequency intraday pricing and tick-by-tick full depth data.

FlexTrade leverages OneTick to add enhanced analytics to EMS platform


Offers integration of OneTick’s capabilities with FlexTRADER’s strategy servers, advanced analytics and signal generation

New York – OneMarketData has announced that FlexTrade Systems offers integration with OneTick, a single solution for complex event processing and historical and real-time tick data, to enhance FlexTRADER, its execution management system. By incorporating OneTick’s analytics capabilities into its EMS, FlexTrade users can employ a single platform to provide market insight.

OneTick’s CEP capabilities enable FlexTrade users to generate trade signals and develop strategies based on real-time and historical data. FlexTRADER will also tap OneTick’s historical database to store tick data as well as trade data for real-time and post trade analysis across asset classes, including equities, foreign exchange and listed derivatives.

“FlexTrade is committed to continuously evolving our platform to meet the growing needs of our clients,” said Vijay Kedia, President & CEO, FlexTrade. “OneTick empowers FlexTrade’s clients with powerful, proven analytics to efficiently and effectively create, test and implement quantitative strategies from one central platform.”

“Financial firms today are under increased pressure to develop and execute sophisticated quantitative trading strategies that can quickly discover untapped alpha in these volatile market conditions,” said Richard Chmiel, senior vice president at OneMarketData. “OneTick’s unique ability to draw on both real-time and historical data will give FlexTRADER users the advanced analytics they need to generate the superior trade alerts and signals that keep them ahead of the competition.”

Richard Chmiel, senior vice president, OneMarketData


Trading Technologies to provide connectivity to MexDer

First Published Monday, 15th April 2013 from Automated Trader : Automated Trading News

Trading Technologies’ new link to the Mexican Derivatives Exchange enables TT’s X_TRADER and API users to trade the main derivatives contracts listed on MexDer

Chicago & Mexico City – Trading Technologies and the Mexican Derivatives Exchange have announced that TT has linked its X_TRADER derivatives trading platform to MexDer, the marketplace for trading derivatives on Mexican benchmarks, via the CME Group’s Globex platform. TT’s connection launched concurrently with CME’s iLink enhancements for MexDer via Globex, which were released on April 14.

TT’s new link to MexDer allows TT’s X_TRADER and API users to trade the main derivatives contracts listed on MexDer. These products include:

  • Currencies: U.S. dollar-peso futures and euro futures
  • Equity Index Futures: Mexican Stock Exchange IPC index futures and options
  • Interest Rate Futures: three-year, five-year, 10-year, 20-year and 30-year bond futures; 28-day interbank interest rate futures; 91-day T-bill futures; inflation index futures

“We are very pleased to be able to provide our clients with connectivity to MexDer. Access to Mexican derivatives means more choices and opportunities for our clients as they manage business in a competitive global marketplace,” said Harris Brumfield, TT’s CEO.

“This announcement is a very important step for our exchange. TT’s link to MexDer through CME Globex will open new business opportunities from north to south and worldwide routing,” said Jorge Alegria, MexDer’s CEO. “The Mexican derivatives market is becoming a very interesting place for global investors that are also looking to trade the global derivatives markets. TT customers will find in our market a friendly legal framework, free convertibility, no withholding taxes for foreigners trading Mexico and a benefit from posting collateral in the U.S.”

Clients can use TT’s full suite of products to enter and manage trades on MexDer, including X_TRADER and ADL, TT’s visual programming platform for automated trading. Firms have the option to host gateways internally or outsource connectivity to TTNET, TT’s fully managed hosting solution. MexDer also is accessible through TT’s new MultiBroker ASP solution, which is currently in beta.

Harris Brumfield, CEO, Trading Technologies


Saxo Capital Markets’ enhanced offering includes new pricing structure

First Published Monday, 15th April 2013 from Automated Trader : Automated Trading News

Target spreads for all FX spot pairs to be reduced, with the new pricing intended to be particularly attractive on EURUSD.

Saxo Capital Markets UK, the multi-asset online trading and investment specialist, has enhanced its offering with the launch of a new pricing structure.

The new pricing structure will mean that the target spreads for all FX spot pairs will be reduced. Some of the key crosses that will see lower target spreads are:

  • EURUSD 2.0 to 1.5
  • USDJPY 2.0 to 1.5
  • USDCAD 4.0 to 1.5
  • AUDUSD 3.0 to 1.6
  • EURJPY 3.5 to 1.9
  • GBPJPY 7.0 to 3.1
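The figures above can be read as target spreads in pips (an assumption; the announcement does not state the unit). As a purely illustrative sketch, the percentage reduction implied by each quoted pair works out as follows:

```python
# Illustrative only: old and new target spreads (assumed to be in pips)
# as quoted in the announcement above.
old_new = {
    "EURUSD": (2.0, 1.5),
    "USDJPY": (2.0, 1.5),
    "USDCAD": (4.0, 1.5),
    "AUDUSD": (3.0, 1.6),
    "EURJPY": (3.5, 1.9),
    "GBPJPY": (7.0, 3.1),
}

# Percentage reduction for each pair: (old - new) / old * 100.
reductions = {pair: (old - new) / old * 100
              for pair, (old, new) in old_new.items()}

for pair, pct in reductions.items():
    old, new = old_new[pair]
    print(f"{pair}: {old} -> {new} ({pct:.1f}% lower)")
```

On these figures the largest proportional cut is on USDCAD, where the target spread falls by more than half.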

Torben Kaaber, CEO of Saxo Capital Markets UK comments: “Two-thirds of our clients trade three or more asset classes using their own specific choice of market access. Whether via FX spot, forwards, futures, contract options, ETFs or CFDs; Saxo Capital Markets offers the choice to trade most asset classes over-the-counter or on exchange.”

He continued: “Saxo’s platform combines a reliable and flexible way to hedge and trade in a multi-asset environment with global market coverage. Since FX is still a major component of clients’ portfolios, we decided to lower target spreads for all FX spot pairs, including EURUSD and USDJPY, in order to further increase the competitiveness of our platform.”

The new lower spread is intended to be particularly attractive on EURUSD, as John Hardy, Head of FX Strategy, Saxo Bank explains, “It is expected that we will eventually head back to the 2012 lows near 1.2000 and possibly even lower, as Europe either pulls together with the help of huge ECB involvement or moves back into crisis mode in the wake of the German elections in September.”

Torben Kaaber, CEO, Saxo Capital Markets UK


Transact chooses Calastone’s re-registration solution

First Published Monday, 15th April 2013 from Automated Trader : Automated Trading News

Integrated Financial Arrangements, operator of the Transact wrap service, selects Calastone for electronic, interoperable re-registration solution


London – Calastone, the transaction network, has been selected by Transact to provide it with a fully interoperable, electronic re-registration solution.

Transact, along with a number of other organisations, took part in the original working group to develop the Calastone re-registration service, and has now chosen the solution to automate its own re-registration process.

Under FSA regulations it is compulsory for platforms to offer re-registration of assets. Whilst these regulations do not stipulate that this process must be automated, it is widely acknowledged that an electronic straight through processing (STP) solution is preferable. Calastone’s solution gives the market the ability to move towards same day re-registration.

The service, based on the requirements of a wide range of market participants as identified in the working group, enables the transfer of legal title in fund units between nominees. It enables STP for “in specie” transfer of assets and provides full visibility to all parties during the lifecycle of the transfer.

Calastone’s re-registration solution allows distributors, platforms and fund managers to automate the transfer of assets. Additionally, Calastone will be able to communicate client re-registration instructions to other market participants not on the Calastone Network, thus enabling Calastone clients to obtain full market coverage via an interoperable model.

Ian Taylor, CEO of Integrated Financial Arrangements, said: “Calastone has a strong track-record of delivering flexible and scalable solutions that help to reduce risk and costs without requiring additional software spend. It has worked with different sectors of the market in order to deliver a product that is effective, easy to use and enables market participants to communicate with each other in order to automate the processes of re-registration. We welcome the fact that someone has delivered a cost effective system that can transfer legal title in seconds. This can only be a good thing for the end investor.”

Dan Llewellyn, Managing Director for Product at Calastone, said: “We are very grateful for the support of Transact which is one of our key customers, and are fortunate to have such a proactive client base that is always pressing us to deliver new services to help automate business processes. Electronic re-registration has been proven and we are keen to expand the service to other customers in the UK and other domiciles. Phase two of the Calastone re-registration service is to enable our clients to transfer other asset classes including equities, pensions, bonds and SIPPs.”
