Banking And Securities Industry Committee (BASIC)
The Banking and Securities Industry Committee (BASIC) was established in 1970 to oversee the settlement of securities and to formulate rules standardizing the processing of stocks and options. The committee is responsible for ensuring uniformity in the rules and regulations that govern securities settlement. In essence, BASIC was created to standardize, automate, and streamline the processing of stock certificates and options.
A Little More on What is the Banking And Securities Industry Committee (BASIC)
The bull market in securities during the 1960s led to the creation of the Banking and Securities Industry Committee (BASIC). Before BASIC was established, traders physically exchanged stock certificates and securities, a rigorous process that required elaborate paperwork for every transfer. As daily trading volume grew, this paperwork became too cumbersome to handle manually, producing what became known as the paperwork crisis. BASIC was established to minimize the physical exchange of stock certificates and the paperwork associated with it. It was formed as a collaboration among major securities-industry organizations of the time, including the National Association of Securities Dealers, the New York Clearing House banks, and others.
Academic research on “Banking And Securities Industry Committee (BASIC)”
Maurer, B. (1999). Forget Locke?: From Proprietor to Risk-Bearer in New Logics of Finance. Public Culture, 11(2), 365-385.
Bell, J., & Arky, S. W. (1970). Public Investor Protection and the Need for Regulation of Transfer Agents. Bus. Law., 26, 1649.
Grody, A. D., Harmantzis, F., & Kaple, G. J. (2005). Operational risk and reference data: exploring costs, capital requirements and risk mitigation. New regulations are embedding operational risk concepts, and the provisioning of operational risk capital, in the risk management considerations of globally active financial enterprises. Inherent in new capital calculations is the effect of losses due to faulty reference data: data that is costly to acquire and maintain, duplicative across the industry, of no strategic value, and comprising 70% of the data content of financial transactions. Faulty reference data has been a persistent impediment to systemic risk mitigation across the global capital and investment markets. Reference data electronically represents financial products and their changing specifications, counterparties, financial intermediaries, corporations, issuers, financial markets, currencies, valuation and market prices, and associated referential information such as credit ratings and fundamental data. This paper attempts to illuminate the effect of faulty data on operating costs, operational risk, and economic capital. It also points toward solutions that have proven to reduce costs and risk in other industries and in other segments of the financial industry. Standards for product and supply chain participants, long a staple in the retail industry, are long overdue in the financial services industry. Industry-wide cost-sharing and risk-mitigating approaches have long been organized around shared infrastructure entities but, to date, have been applied only to the value portion of transactions (principally quantities, transaction prices, and amounts). This paper argues for these same techniques to be applied to the matching and “clearing” of the reference data components of these transactions.
The authors conclude that data and its management are costly, averaging $740 million each for the largest financial enterprises, and that faulty data is at the core of significant components of operational losses. Finally, the authors believe that industry-wide collaborative initiatives can significantly reduce data costs, lower capital requirements, and mitigate risk.
Harris, A. B. (1976). Legislative Perspective on Automation of the Securities Handling Process and the Background of the Securities Acts Amendments of 1975. Jurimetrics J., 17, 146.
Milne, A., & Chan, K. K. (2019). The Global Legal Entity Identifier System: How Can It Deliver? We examine the global legal entity identifier (LEI) system for the identification of participants in financial markets. Semi-structured interviews with data professionals reveal the many ways in which the LEI can improve both business process efficiency and counterparty and credit risk management. Larger social benefits, including the monitoring of systemic financial risk, are achievable if it becomes the accepted universal standard for legal entity identification. The interviews also reveal substantial co-ordination and investment barriers to LEI adoption. To address these, a clear regulatory-led road map is needed for its future development, with widespread application in regulatory reporting.