Terracotta Blends Low Latency and Big Data (remember they bought APAMA)

http://low-latency.com/blog/terracotta-blends-low-latency-and-big-data/?utm_source=weekly&utm_medium=email&utm_campaign=ll_13-07-18

Software AG’s Terracotta unit has introduced its Universal Messaging product, a middleware platform spanning enterprise to web to mobile connectivity, providing low latency performance and big data transport as appropriate – and aligning with the emerging ‘big data in motion’ paradigm.

“Big, fast data at your fingertips requires seamless real-time data movement, no matter what type of system or device you are using,” says Eddie McDaid, managing director of Universal Messaging at Terracotta.

Previously offered as part of parent Software AG’s webMethods SOA middleware, Universal Messaging is now being integrated with Terracotta’s BigMemory in-memory technology and In-Genius analytics offerings. The product originated with my-Channels, which Software AG acquired in April 2012.

Universal Messaging offers common APIs – for C++, C#, Java, VBA, .Net and Python – to support publish/subscribe, message queue and peer-to-peer communications. It also supports a number of different transports, from low-latency shared memory and multicast for enterprise applications to fuller-function sockets and HTTP/S for web and mobile.
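To make the publish/subscribe model concrete, here is a minimal sketch of what publishing and subscribing on a channel might look like. It is purely illustrative: the Channel and Listener types below are hypothetical stand-ins, not Universal Messaging’s actual Java API.

```java
// Hypothetical pub/sub sketch; these types are illustrative stand-ins,
// not Universal Messaging's actual Java API.
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class PubSubSketch {
    interface Listener { void onMessage(byte[] payload); }

    // A toy in-process channel standing in for a broker-managed topic.
    static class Channel {
        private final List<Listener> subscribers = new CopyOnWriteArrayList<>();
        void subscribe(Listener l) { subscribers.add(l); }
        void publish(byte[] payload) { subscribers.forEach(s -> s.onMessage(payload)); }
    }

    public static void main(String[] args) {
        Channel prices = new Channel();
        prices.subscribe(msg -> System.out.println("Received: " + new String(msg, StandardCharsets.UTF_8)));
        prices.publish("EURUSD 1.3012".getBytes(StandardCharsets.UTF_8));
    }
}
```

In a real deployment the channel would be backed by one of the transports above – shared memory or multicast for low-latency enterprise use, sockets or HTTP/S for web and mobile – without the application code changing.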

Indeed, the product supports a number of electronic trading services, including for FX and fixed income, from the likes of Deutsche Bank, UBS, Credit Suisse, JPMorgan Chase and Tullett Prebon. 

Separately, Terracotta this week completed its acquisition of complex event processing vendor Apama, which is also likely to form part of its big data platform, offering the ability to perform analytics on streaming data.

Big Cable’s Sauron-Like Plan for One Infrastructure to Rule Us All

http://www.wired.com/opinion/2013/07/big-cables-plan-for-one-infrastructure-to-rule-us-all/

When Liberty Media chairman John Malone talks, it’s a good idea to pay attention. And this month, the craggy, whip-smart, billionaire cable mogul has set his sights on having the entire cable distribution industry charging for buckets of bits. Which means the Internet in America — as well as in the U.K., Belgium, Holland, Germany, and Switzerland — is in big trouble.

The issue is “cableization” of the entire Internet Protocol enterprise. After all, the cable distribution pipe is just a giant set of channels that will be dynamically reallocated between “Internet” access and other IP-based cable-provided services.

Malone’s bet (his word) is that we’ll all be buying channels from our local cable guy in the form of IP packets, and the cable industry will pull off the unrestrained monetization of its long-ago sunk cost in installing local monopoly distribution networks:

“…over the years more and more content is going to come all IP, all platforms, random access. And as that happens, the bandwidth demands are going to force market share cable’s way… [I]f cable can get its act together … I wouldn’t be surprised if you’d see over the top service providers that are wholesale to the cable operator, retail to the consumer, and that are bundled and discounted with the broadband connectivity side of the product offering. As that transpires, I think it’s going to change the game pretty dramatically.”

Malone calls this “creating value off the scale of a cooperative industry.” But creating this value for them is bad news for the rest of us.

 


What Happens When One Cable Rules Them All

Malone is my favorite cable guy because he’s frank, smart, and refreshingly outspoken: He admitted in 2011 that “cable’s pretty much a monopoly now” because it’s the only terrestrial network that can provide the high-capacity, low-latency connectivity needed for the applications of the future.

Cable’s only real competition is Verizon’s bundled Internet, telephone, and television over a fiber-optic communications network (FiOS), which “ran out of steam” (as Malone put it at last month’s shareholders’ meeting) when Verizon stopped expanding a few years ago. And AT&T’s U-Verse product doesn’t run fiber all the way to homes or provide the bandwidth of a cable connection.

All of this means, according to Malone, that “Cable is clearly winning in the U.S. broadband connectivity game.” Liberty Media is energetically back in that game, and Malone’s got big plans for the global future of the Internet. With 25 million subscribers worldwide following his acquisition of British cable company Virgin Media in February 2013, Malone’s company is now the largest cable distributor in the world.

Now he wants to “get together” with the other cable giants to “create global scale.”

In February, Malone bought 27% of the fourth largest cable distributor in America, Charter, a company that faces minimal competition from either FiOS (just a 4% overlap) or U-Verse (20%). Charter’s balance sheet and Malone’s access to long-term, low-interest financing will allow him to roll up additional cable distribution companies across the country. Meanwhile, Malone’s sorry he ever sold his TCI cable systems to AT&T for $54 billion in 1999, because he knows “the most addictive thing in the communications world is high-speed connectivity.”

While innovators around the world want to develop world-changing applications that require a lot of bandwidth — think telemedicine, tele-education, anything requiring “tele”presence — they’re in for a shock.

Because they (or their users) will have to pay whatever cable asks for the privilege of that reach.

That’s Malone’s plan: He wants the cable industry to sit right in the middle of the road that runs between online innovation and users, asking for tolls from applications and users alike.


How Will They Get Away With It? The Plan Behind the Plan

To make his plan work, Malone wants the cable industry to act collectively. His logic: Ensure that no maverick breaks ranks and provides users of IP bits with unlimited capacity at a reasonable price.

The key tool he’d like the industry to use to bring this vision to life is tiered pricing on the user side.

We’ve known for a while that the cable industry is interested in charging for buckets of bits used during a given period of time. We also know that tiered pricing is based on justifications such as fixing congestion, or recouping network investments.

But tiered pricing has little or nothing to do with either of those things.

Having made their significant network investments some time ago, the big cable guys are in harvesting mode and have been reaping enormous revenues for years. Comcast’s and Time Warner Cable’s combined revenues of $172 billion (between 2010 and 2012) were more than seven times their capital investment of $23 billion over the same period. Not only are the big cable companies’ revenues many times larger than their capital expenses, but the gap is growing over time.


Usage caps are aimed at “fairly monetiz[ing] a high fixed cost,” former FCC Chairman Michael Powell said earlier this year. (He’s now the head of the National Cable & Telecommunications Association, which is clearing the way to drop the words ‘cable’ and ‘telecommunications’ from its brand by renaming itself “NCTA: The Internet and TV Association”.) The caps are not aimed at addressing high-bandwidth uses at peak hours, which might degrade the online experience of other users. (Outside peak hours, it makes no difference to the functioning of the network if someone is downloading a lot of bits.)

In a non-competitive local market, data caps are excellent tools with which to make as much money as possible from an existing monopoly facility. Although cable distributors could charge end-users a low flat fee for high download speeds — and Malone is confident that he’ll get his systems to gigabit downloads with very little investment — they have no reason to.

So Malone’s planning a use-based program that goes into broadband connectivity, “so that, you know, Reed [Hastings, CEO of Netflix] has to bear in his economic model some of the capacity that he’s burning … And essentially the retail marketplace will have to reflect the true costs of these various delivery mechanisms and investments.”

What he’s really saying is: Anyone who wants to use my pipes will have to give me money.


The cable industry, already gingerly exploring tiered pricing and usage-based billing, argues that such consumption caps are fair. They’re not choosing winners and losers, they say, they’re just drawing lines that affect every online application equally.

But that’s not true. A data consumption cap has the same effect as a priority lane or a toll-free lane for favored applications. It will reliably dampen demand across all users for any online application that is subject to the cap, according to French consulting firm Diffraction Analysis’ November 2011 report, “Do Data Caps Punish the Wrong Users?”

So some big guys will pay to avoid the cap, and little guys will be stuck trying to reach new customers who are worried about overage charges.

It’s not too late to pay attention to John Malone.

Knowsis Offers Social Media Sentiment Data to Support Trading Strategies

http://low-latency.com/blog/knowsis-offers-social-media-sentiment-data-support-trading-strategies/?utm_source=weekly&utm_medium=email&utm_campaign=ll_13-06-27

Start-up data and analytics firm Knowsis is working with an initial five clients to incorporate its social media sentiment data into their trading strategies. The company declines to name the clients, but says one will go live imminently with strategies built around signals from Knowsis data, while others are considering how to add sentiment data as an additional parameter to their strategies.

London-based Knowsis was formed as a private company in January 2012 with seed investment from Method Investment and Advisory, a funds and quant trading house with which Knowsis can also develop and validate its concepts. Knowsis’ founder and CEO Oli Freeling-Wilkinson is joined by chief technology officer Mark Unsworth, a technologist and developer most recently with online music company 7digital, and two further employees.

The company was conceived by Freeling-Wilkinson in 2010, when there was little financial market recognition of the potential of social media, but he says the question now is no longer whether social media is important, but how to use it. He adds: “Any quantitative trader, hedge fund or risk manager should be interested in data that is proven to help make decisions and money.”

While the company isn’t naming names – its own is based on gnosis, the Greek noun for knowledge – it has worked with quantitative funds to test the use of social media sentiment data in trading strategies. The tests were based on a portfolio of stocks, with each stock being given a sentiment rating based on social media activity and simple trading strategies then being devised. The company says all trading strategies in the test generated returns above those achieved by major indices over the same time period.

Knowsis’ methodology is based on one overarching question: will sentiment make a difference to asset prices? With this in mind, its analysis identifies underlying behavioural trends rather than supporting a ‘trade by tweet’ approach.

The company initially considered supporting its service with the third-party sentiment data systems frequently used by large consumer brands, but found these lacked financial relevance. Instead, it has built technology from the ground up to mine, manage and analyse the vast quantities of data produced by social media. It uses scrapers and similar technologies to gather data, filtering about 1% of Twitter’s 400 million daily tweets with in-house algorithms designed to identify tweets that have financial relevance. It also looks at Facebook, but only to a limited extent as it is a closed system, and scans further blogs and forums that it chooses not to name for fear that malicious activity could skew results. The platform is data agnostic, meaning any social media source could be used to feed the algos, and it also includes a machine learning element.
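As a rough illustration of that first filtering pass, the sketch below flags tweets by cashtags and a few macro keywords. This is an assumed heuristic for illustration only – Knowsis’s actual relevance algorithms are proprietary and undisclosed.

```java
// Illustrative first-pass relevance filter. Cashtag/keyword matching is an
// assumed heuristic; Knowsis's real algorithms are proprietary.
import java.util.List;
import java.util.regex.Pattern;

public class RelevanceFilter {
    // Cashtags such as $AAPL, plus a handful of macro keywords.
    private static final Pattern CASHTAG = Pattern.compile("\\$[A-Z]{1,5}\\b");
    private static final List<String> KEYWORDS =
            List.of("fed", "ecb", "earnings", "downgrade", "default");

    static boolean isFinanciallyRelevant(String tweet) {
        if (CASHTAG.matcher(tweet).find()) return true;
        String lower = tweet.toLowerCase();
        return KEYWORDS.stream().anyMatch(lower::contains);
    }

    public static void main(String[] args) {
        System.out.println(isFinanciallyRelevant("$AAPL looking weak into earnings")); // true
        System.out.println(isFinanciallyRelevant("lovely weather in London today"));   // false
    }
}
```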

Once data is collected, a sentiment analysis tool aggregates underlying behaviour around an asset, and the asset is given a number on a scale of -100 (bearish) to +100 (bullish). A list of securities, predominantly macro assets and some stocks, and their sentiment scores is then made available to users via an application programming interface that is updated in real time, although Knowsis notes the volatility of intra-day conversation and prefers to promote end-of-day aggregated sentiment data that can be validated and checked for false or misleading information.
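To illustrate the scoring scale, here is a minimal sketch of rolling message-level sentiment up into a single -100 to +100 asset score. The clamped-average aggregation is an assumption for illustration; Knowsis does not disclose its method.

```java
// Sketch of aggregating per-message sentiment into the -100 (bearish) to
// +100 (bullish) asset score described above. The clamped-average method
// is an assumption, not Knowsis's disclosed approach.
import java.util.List;

public class SentimentScore {
    // messageScores: per-message sentiment in [-1.0, +1.0] from a classifier.
    static int assetScore(List<Double> messageScores) {
        double mean = messageScores.stream()
                .mapToDouble(Double::doubleValue)
                .average()
                .orElse(0.0);
        long scaled = Math.round(mean * 100);
        return (int) Math.max(-100, Math.min(100, scaled));
    }

    public static void main(String[] args) {
        // Three messages about one asset: two mildly bullish, one mildly bearish.
        System.out.println(assetScore(List.of(0.6, 0.2, -0.1))); // prints 23
    }
}
```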

The company is working across financial markets and its clients include high frequency, low latency traders that pick up signals in real time, but its drive is towards understanding rather than speed. Freeling-Wilkinson explains: “Knowsis is not about ultra low latency delivery, but about longer term underlying behaviour trends. The aim is for end-of-day data to be used to project how certain asset classes may behave during the next week.”

With five clients in its portfolio and R&D ongoing, Knowsis’ next plan is a sales push and a search for specialist market data partners that can distribute its social media sentiment data on a global scale.

TRADING AND BIG DATA


http://www.securitiestechnologymonitor.com/news/trading-looks-to-big-data-to-uncover-abuse-31789-1.html?ET=securitiesindustry:e4249:190117a:&st=email&utm_source=editorial&utm_medium=email&utm_campaign=SIN_DailyClose__062713

Perseus Brazil Debuts Market-To-Market Liquidity Platform LiquidPath®

http://perseustelecom.com/services/products/liquidpath-brazil/

LiquidPath® combines the Perseus award-winning connectivity solutions with the Perseus global market-to-market ultra low-latency network. LiquidPath is a fast, cost-effective way to deploy the equipment that must be staged in foreign markets, so that customers do not have to manage the complexities of having “feet on the street” in new emerging markets.

Perseus Telecom customers see a variety of advantages when choosing LiquidPath:

Through efficient, high-performance trading infrastructure ideal for staging Market Data, Order Management (OMS), and Algorithmic and High-Frequency Trading equipment, customers benefit from state-of-the-art equipment ready to be turned on as a service.

In complex environments, Perseus can offer proximity services for Direct Market Access (DMA) platforms, helping customers start trading with exchanges or counterparties quickly, saving time and money.

With LiquidPath®, customers can keep their IT investments balanced, making it easier to plan and allocate IT expenditure for trading emerging or foreign markets.

“Liquidity Infrastructure” for local and global buy-side, sell-side and service vendors looking to access the Brazilian Securities marketplace.


LiquidPath Brazil

Perseus Telecom Brazil helps customers meet their requirements for low-latency market access and cost-efficient IT products and services, saving both time and money.

Infrastructure

  • Exchange proximity colocation
  • Hardware as a service
  • Ultra-low latency connectivity
  • Elasticity (up and downsizing)
  • Managed and Professional Services

Connectivity

  • CT1 – 1st BVMF DC (30µs)
  • CT2 – 2nd BVMF DC (5ms)
  • SP2 / RJ1
  • Internet / Last Mile
  • Global Liquidity Centers Access

Market-To-Market


HIGH PRECISION TRADING IN COMPLEX MARKETS

Perseus Telecom is an award-winning global provider of connectivity and services. We work with best-of-breed fiber assets globally, providing customers with the right network solution at the right price. Whether connecting trading desks to exchanges, establishing global wide area networks, or connecting from Europe and North America to emerging markets in Latin America, Asia and Africa, our customers have the competitive advantage that comes with innovation and experience in the finance, banking, technology, law, e-commerce, multi-site enterprise, pharmaceutical, media and telecom sectors.

Informatica Highlights Performance of SMX Messaging

http://low-latency.com/blog/informatica-highlights-performance-smx-messaging/?utm_source=weekly&utm_medium=email&utm_campaign=ll_13-06-20

blog | June 18, 2013 – 2:34am | By Pete Harris

Following on from last month’s announcement of Ultra Messaging SMX, Informatica has published a range of latency and throughput performance figures for the shared memory transport, covering a number of programming languages. Messaging latency as low as 39 nanoseconds was recorded, with overall latency more than 16 times lower than in tests of an earlier version of the transport, conducted in May 2010.

Ultra Messaging SMX is designed for messaging within a single server – in fact within a single multi-core chip, an architecture that has become increasingly common as Intel has rolled out its Sandy Bridge (and now Ivy Bridge) microprocessors, with up to 12 cores on certain Ivy Bridge chips. SMX leverages on-chip cache memory, which is faster than fetching data from standard RAM.

Latency tests were conducted between threads running on the same core (Intel supports two threads per core) and between cores on the same chip. Throughput tests were conducted from one thread to threads across many cores on the same chip. Informatica did not test latency between cores on different sockets, since it would have been higher than for a single socket.

Informatica tested its transport against C, C# and Java APIs, noting that trading systems are often built using a number of languages, so such support is a typical requirement. The latency test system was a server with a four-core Intel Xeon E5-1620 clocked at 3.6 GHz, while the throughput tests used a server with a (pre-release) 10-core Ivy Bridge chip operating at 2.8 GHz. CentOS and Red Hat Linux hosted the C and Java tests, with Microsoft Windows 7 Professional SP1 supporting the C# tests.

Some highlights from the tests are:

* Thread-to-thread latency on the same core, for the C API, was 39 nanoseconds for 16 byte messages, 48 nanoseconds for 128 byte messages, and 81 nanoseconds for 512 byte messages.

* Thread-to-thread latency on a sibling core, for the C API, was 103 nanoseconds for 16 byte messages, 111 nanoseconds for 128 byte messages, and 135 nanoseconds for 512 byte messages.

* C# and Java latencies were a bit higher.  For example, latency for 512 byte messages between threads on the same core was 135 nanoseconds for C# and 106 nanoseconds for Java.

* As an example of a throughput test, 16 byte messages were transmitted from one thread to up to 19 other threads on the same chip. With 19 receivers and the C API, throughput of 133.92 million messages/second was achieved without batching of messages. Batching – which increases latency – raised this to 305.34 million messages/second. Informatica found that throughput increased nearly linearly as receivers were added.
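For context, thread-to-thread latency figures like these are typically measured with a ping-pong harness: one thread writes a token, a second thread echoes it, and the round-trip time is halved. The generic sketch below uses plain Java atomics rather than Informatica’s SMX API (which is not shown in the article), so its results will sit far above SMX’s reported numbers; it only illustrates the measurement technique.

```java
// Generic ping-pong latency harness, for illustration only. It uses plain
// Java atomics, not Informatica's SMX API, so expect results well above
// the 39 ns figures reported for SMX.
import java.util.concurrent.atomic.AtomicLong;

public class PingPongLatency {
    private static final int ITERATIONS = 1_000_000;
    private static final AtomicLong token = new AtomicLong(0);

    public static void main(String[] args) throws InterruptedException {
        Thread echo = new Thread(() -> {
            for (long expected = 1; expected <= 2L * ITERATIONS; expected += 2) {
                while (token.get() != expected) { } // spin until pinged
                token.lazySet(expected + 1);        // pong back
            }
        });
        echo.start();

        long start = System.nanoTime();
        for (long i = 1; i <= 2L * ITERATIONS; i += 2) {
            token.lazySet(i);                       // ping
            while (token.get() != i + 1) { }        // spin until ponged
        }
        long elapsed = System.nanoTime() - start;
        echo.join();

        // Each iteration is a full round trip; halve for one-way latency.
        System.out.printf("approx one-way latency: %d ns%n", elapsed / ITERATIONS / 2);
    }
}
```

Pinning each thread to a specific core (for example with taskset on Linux) is what makes the same-core versus sibling-core distinction in the figures above reproducible.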

While the significant decline in high frequency trading has reduced the overall need for such low latency transports, Informatica notes that they are still required for other trading operations and strategies, such as arbitrage, market making and smart order routing.

Q&A: Ian Blance of Six Financial Information on the Importance of Evaluated Pricing


Q&A: Ian Blance of Six Financial Information on the Importance of Evaluated Pricing

http://www.referencedatareview.com/article/qa-ian-blance-six-financial-information-importance-evaluated-pricing/?utm_source=house&utm_medium=email&utm_campaign=rdr_13-06-19-general

Q and A | September 27, 2012 – 9:53am

Ian Blance is head of evaluated pricing business development at SIX Financial Information, a major supplier of valuations and pricing services across global financial markets.

Why are valuations such an important aspect of the data business?

The valuation of assets is fundamental to the functioning of the financial system. Without an appropriate value, mission-critical activities would simply not be possible – fund NAV calculations, client reporting, book P&L, performance measurement, risk measures, company accounts, collateral management … the list is endless.  Many of these valuations are collected or calculated and then disseminated by the data industry, which makes the data vendors key players in this piece of the business. When it comes to exchange-traded asset classes, the collection and delivery of valuation prices is relatively uncontroversial. But when it comes to illiquid or thinly traded instruments, the challenge – to vendors and to users – is much greater.

What’s driving the current industry attention to valuations quality?

Two things. First, regulatory and audit oversight of valuation sources and processes has increased markedly since the financial crisis. The G20 placed valuation – in the sense of the essential need to know the true value of assets – at the centre of the turmoil, and initiated a programme to ensure that the issues highlighted by the collapse or bail-out of financial institutions were addressed. The way financial firms value their asset holdings has come under unprecedented scrutiny. Second, those institutions have themselves learned many lessons from the crisis and are determined not to have their fingers burned again!

Which markets and classes are best served by current industry offerings? Which are worst served?

For regular corporate and sovereign fixed-income securities, there are a number of well developed evaluation services, driven by the data strengths of the main data vendors, which have accepted methods and a good track record. There are fewer participants in the markets for mortgage products, and there is certainly more to take into account here from a data management and model calibration perspective. But the vendors in this space also have proven services.

Stepping outside of the fixed-income space, OTC derivative and structured product evaluations have more recent provenance.  There are some newer vendors, as well as smaller niche providers, involved in this segment, but Big Data has yet to come up with a truly comprehensive response here.  The specialist nature of the methodology and data, as well as the sometimes highly subjective calibration and input assumptions, make the evaluation of these asset classes a more uncertain business. OTC structured retail products, in particular, do not yet have a robust source.

When assessing third-party valuations services, what are the key parameters financial institutions should assess?

There are the usual questions relating to independence, reliability, consistency, etc. (and from a user perspective, cost is a much more important factor than many like to admit!). But we believe that in today’s world, the most important factor to judge is the defensibility of evaluations. The fundamental question a user needs to ask is: ‘Can I defend the use of this evaluation to anyone who may ask?’

To SIX Financial Information, ensuring defensibility means two things. First, the user must fully understand how the evaluation was produced. This implies that the full details of the methodology, data inputs and assumptions need to be provided to the user – full transparency. A user should never have to ask ‘how’ an evaluation was generated. 

Second, the user must be able to justify the use of the evaluation to any external party – client, regulator, auditor, risk group, etc. Having all of the necessary information to understand the evaluation is part of this. But so is having the comfort and assurance that there has been some kind of review and oversight of the vendor models and process.

In the case of SIX Financial Information, we felt that it was appropriate to have our Evaluated Pricing Service reviewed and assured by a major international audit firm, and their report on our service is available to our clients.

How important is transparency of methodology, both for suppliers of valuations data and for consuming organisations, who need to assure their own clients of the robustness of their models?

We believe that this is fundamental to the acceptance of an evaluated price; not only transparency of methodology, but its suitability for the job (preferably judged by a source independent of the vendor) and also the data inputs and assumptions that drove the model, operational process and controls. The age of the black box is dead! Above all else, the user needs to trust the robustness of the service and be comfortable recommending it to their clients.

Another factor here is the delivery of transparency not only to the users, but also to their end-clients. Connecting the vendor, the user and the end-client with the same amount of information greatly improves the quality and timeliness of resolving any queries or price challenges.

What about timeliness? Are we heading toward real (or near real)-time valuations pricing?

Given that the asset classes that lend themselves to evaluated prices are, except in some limited areas such as on-the-run US Treasury benchmarks, not real-time tick-by-tick markets, the notion of a real-time evaluation seems a little unusual. There are some services promoting this concept, but it is still new and remains to be seen whether it will develop into the orthodoxy.

Having said that, it is clear that many users now require multiple snap times and this trend is accelerating. The availability of this for all asset classes, however, is patchy. For some complex markets and OTC derivatives, an overnight batch valuations service, which allows the collection of all the required market data prior to evaluations, remains the norm. It is possible that this might improve with increasing access to liquid market data when some of this trading moves on-exchange.

Is there cross-over between valuations services designed for front office applications like trading and those geared toward portfolio valuations in the middle/back office?

This is the Holy Grail: front to back consistency on valuation. But it is surprisingly difficult to achieve in practice.

The whole approach to valuation differs in the front and back offices. In the front office, valuation tends to be forward looking – a trader or a fund manager will have a view, and possibly a position, in a specific instrument and, within reasonable limits, their assessment of the value will be informed by this view.

In the back office, and for vendors who service this market, the approach to valuation is essentially backward looking. A value is produced that reflects all the known market information relevant to this instrument at the time of the valuation. There is no attempt to second-guess this information or to tweak it to reflect a proprietary market or security-specific viewpoint. It would certainly be useful – and arguably sound practice – if the prices used in the front and back office were broadly aligned, but there are no guarantees.

What will be the next major developments in the valuations business?

We have already mentioned defensibility, which we believe will be the major driver for evaluations over the next few years. Increasing frequency of snap times and further expansion of coverage into previously poorly served areas are also likely.

One of the more interesting developments, though, relates to the requirement that many hitherto OTC-traded derivatives be exchange-traded and/or centrally cleared. This raises the prospect that much more market data on these rather opaque markets will become publicly available and consequently available for use in the evaluation process. One can expect that this will greatly improve the process for valuation of these asset types, but could it also be the spark for Big Data finally to get seriously involved in this segment?

 

EUROPA – Commission welcomes Parliament adoption of new EU Open Data rules

S&P Capital IQ launches cross referencing tool


S&P Capital IQ launches cross referencing tool

S&P Capital IQ’s Business Entity Cross Reference Service provides cross-referencing capabilities to standardized and proprietary market identifiers

http://www.automatedtrader.net/news/at/142864/sp-capital-iq-launches-cross-referencing-tool

 

London – S&P Capital IQ, the provider of multi-asset class data, research and analytics, has launched a new tool intended to identify systemic and counterparty risk while reducing operating costs.

Its new Business Entity Cross Reference Service provides cross-reference capabilities for over 2 million public and private entities using standardized and proprietary identifiers, including all available Legal Entity Identifier (LEI) codes.
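As a hedged sketch of what the consumer side of such a service might look like, the snippet below models an entity record keyed by LEI that maps across identifier schemes. The field names and identifier values are hypothetical, not S&P Capital IQ’s actual schema or data.

```java
// Hypothetical consumer-side model of an entity cross-reference record.
// Field names and identifier values are illustrative, not S&P Capital IQ's
// actual schema or data.
import java.util.HashMap;
import java.util.Map;

public class EntityCrossReference {
    record Entity(String lei, String name, Map<String, String> vendorIds) { }

    private final Map<String, Entity> byLei = new HashMap<>();

    void add(Entity e) { byLei.put(e.lei(), e); }

    // Resolve another identifier scheme from a counterparty's LEI.
    String resolve(String lei, String idScheme) {
        Entity e = byLei.get(lei);
        return e == null ? null : e.vendorIds().get(idScheme);
    }

    public static void main(String[] args) {
        EntityCrossReference xref = new EntityCrossReference();
        xref.add(new Entity("EXAMPLELEI0000000042", "Example Holdings PLC",
                Map.of("CUSIP_ISSUER", "999999", "SP_ENTITY_ID", "SP-0001234")));
        System.out.println(xref.resolve("EXAMPLELEI0000000042", "SP_ENTITY_ID"));
    }
}
```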

“We have developed this cross-reference tool to aid market participants with their mandates to satisfy greater regulatory requirements, solve for lack of standardization, and remove the burdensome task of processing and tracking legal entity structures,” said Rui Carvalho, Managing Director, S&P Capital IQ. “The need for meaningful and broad transparency has created a host of new regulations mandating new and enhanced risk measurement criteria.”

“Firms that manage different sources of information need to understand and maintain, on a daily basis, the linkages between instruments, entities, issuers, securities and sector classifications for better geographical and counterparty exposure reporting,” said Roger Fahy, Vice President, S&P Capital IQ. “This is what our solution provides.”

What colour is my trade?


http://thetradenews.com/USA_Features/Industry_Profile/What_colour_is_my_trade_.aspx
