Brokerage in an Information Economy

Jerry FOSS <jerry.foss@marconicomms.com>
Kulwinder GARCHA <kulwinder.garcha@marconicomms.com>
Marconi Communications
United Kingdom

Philip TURNER <pjt@ecs.soton.ac.uk>
Nick JENNINGS <nrj@ecs.soton.ac.uk>
Southampton University
United Kingdom

Abstract

Intermediation services--brokerage--are a key component of on-line trading environments. Intermediation covers a range of intermediary services between consumers and information sources. One of the key scenarios in such environments is that future on-line enterprises, including brokers themselves, follow a net-centric component-ware paradigm: This assumes a trading environment where services are composed of third-party-provided service components which are subject to commercial procurement from commercial suppliers. This scenario demonstrates that (i) the architectures for the provision of future information and communications services are very dynamic, resulting in layers of service platforms supported by lower-level services and components; and (ii) this environment for the construction of services is commercially volatile and becomes greatly dependent on commercial intermediaries and brokerage. The scenario goes on to consider information services as on-line businesses, which are autonomous and continually re-assessing their own market environment. These automated enterprises would therefore constantly self-organize and reconfigure to fulfill their optimal business development. One realization of this may be as virtual networks of intelligent agents and service components browsing between each other, trading marketing information of potential benefit to trading partners.

This paper describes our initial effort toward building such systems. In particular, we aim to build a simulated trading environment, populate it with trading parties, and implement some trading services. This involves building the main trading entities: customers, suppliers, and brokers.

A variety of billing strategies and customer types are emulated in the model, which will then be extended to emulate a (relatively) simple on-line business evolving to provide more adventurous services. The brokers start each demonstration run as "empty shells" with little functionality. As trading starts, the brokers are triggered to acquire components to build their brokerage functions. Once these are integrated into functioning brokers, they mediate trade between customers and suppliers. Much of the component-ware is agent-based. These agents perform tasks such as information filtering, search, and contractual negotiation.

The project application scenario then considers the brokers evolving to acquire new functions. One example function is to form a trading cartel between participating brokers, with the aim of forming a game-theoretical trading policy. The functionality to perform the cartel negotiation is acquired by each broker as agent-based component-ware. This supplies the simulated market with an observable (and authentic) dynamic. The scenario may then be taken further to observe the effects of one of the brokers breaking out of the cartel and adjusting its prices to attract a higher market share (i.e., a "myopic" trading policy).

Introduction

Intermediation services [1] will become increasingly important in the rapidly evolving on-line trading environment. In this scenario, future on-line enterprises (including brokers) and the services they offer will be composed of third-party-provided service components which are subject to commercial procurement from commercial suppliers. There is a need for services to be rapidly created, deployed, managed, and marketed: Services are re-created and re-configured by the hour or by the minute. Services and their components may be leased, licensed, pay-per-use, or acquired under negotiable contracts. This creates a volatile commercial service market where millions of financial transactions of micro-units of currency become greatly dependent on commercial intermediaries and brokerage. Now consider little or no human involvement--any of these on-line enterprises may be completely automated.

The scenario continues to consider information services as on-line businesses which are autonomous and continually re-assessing their own market environment. Automated enterprises constantly self-organize and re-configure to fulfill their optimal business development, that is, virtual networks of intelligent agents and service components interacting between each other, trading marketing information of potential benefit to trading partners. With the increasing need for intelligent marketing and environmental relationship management--services that relate the client to their market--intermediaries may expand their expertise and service base and become providers of consultancy and integration services.

Scope

Intelligent agents are seen as one of the enabling technologies for these scenarios. Agents are often the component-ware referred to in the scenario described above; they may provide tools for filtering, profiling, searching, negotiation, and communication. However, one of the characteristics observed in early agent-based trading compared to human trading is that the agents' speed and ease of computation and communication can lead to erratic performance with a detrimental effect on the market. This can result in periods of stability suddenly erupting into price wars, whereas the latency of human reaction gives current markets their traditional comparative stability.

To investigate the "traded component-ware" paradigm (see diagram 2) for service construction, deployment, and management, Marconi Communications and the Department of Computer Science at Southampton University are conducting a project, "Brokerage In An Information Economy" [2]. The project aims to investigate the extent to which multi-agent technology may be used to realize the service architecture and produce and manage future network services. A trading environment has been constructed with trading entities--customers and suppliers. The next phase was to implement brokerage businesses using agent technology. The brokerage businesses then undergo a simulated evolution in which they effectively acquire the components necessary to offer a new range of services. The aims of the project are to derive the agent architectures needed to implement the brokerage services and to investigate the agent interactions resulting from running the simulated market trading processes.

The aim is to derive generic service architectures that can be reconfigured to new applications using service components; in other words, for the project it does not matter what the traded commodities are. However, for demonstration purposes the media industry has been chosen as the trading domain. This industry was chosen because it is a strong growth sector for e-commerce; it comprises a number of converging industries; it has potential to demonstrate many of the dynamics we are interested in; there is a large consumer base; and products are deliverable on-line.

Specific technical aims are to build and maintain a broker node which identifies and processes some specified customers' tasks. It also needs to seek, acquire, and integrate components into the broker node; disassemble components when they are no longer required; deploy and administer services; and re-configure the node according to requirements in the evolving market environment.

The first model produced was a standard on-line order processing system demonstrating a number of generic service elements. Customers were set up with specific profiles and suppliers with specific billing policies. To distinguish suppliers, and to provide a degree of authentic individuality, loyalty and volume discounts were added as variables to their pricing policies. Suppliers specialized in certain price bands and commodity-types, while customers were also categorized toward a specific price band in which most of their requests would be generated. Profiles and data were loaded separately prior to each demonstration run, enabling several data runs with variable data.
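The pricing side of this first model can be sketched as follows. This is a minimal illustration: the discount thresholds and rates below are invented assumptions, standing in for the per-supplier policies that are loaded before each demonstration run.

```python
# Hypothetical sketch of a supplier pricing policy with volume and
# loyalty discounts. All thresholds and rates are illustrative
# assumptions, not the project's actual parameters.

def quote(base_price: float, quantity: int, orders_from_customer: int) -> float:
    """Return a unit price after applying volume and loyalty discounts."""
    price = base_price
    if quantity >= 50:                # assumed 10% volume discount above 50 units
        price *= 0.90
    elif quantity >= 10:              # assumed 5% between 10 and 49 units
        price *= 0.95
    if orders_from_customer >= 20:    # assumed 5% loyalty discount for regulars
        price *= 0.95
    return round(price, 2)

print(quote(10.0, 60, 25))  # volume + loyalty: 10.0 * 0.90 * 0.95 = 8.55
```

Varying these thresholds and rates per supplier is what gives each supplier the "degree of authentic individuality" described above.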

The second phase of the project demonstrated the construction and initiation of a broker. Initially a broker is an empty shell--it contains no specific services. As the market initializes, the broker is triggered to acquire the necessary service components which, when integrated, will realize a range of functions. Which functions are needed is a decision of business policy, so the derivation of this functionality is outside the scope of this project. However, it is anticipated that this research will extend into the technology required for the qualitative areas of market intelligence and business development.

The broker must receive requests for commodities from customers. The broker must then acquire costs from a number of suppliers. The broker completes the contract with the customer according to the lowest price received from the prospective suppliers. Some of the suppliers may offer volume-discount deals to the brokers.
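The broker's contracting step amounts to a lowest-quote selection over the suppliers' responses. A minimal sketch, assuming a hypothetical quote format (the supplier names and prices are invented):

```python
# Sketch of the broker's contract step: gather quotes from suppliers
# and contract at the lowest price received before the deadline.

def select_supplier(quotes: dict[str, float]) -> tuple[str, float]:
    """Pick the supplier offering the lowest price for a request."""
    supplier = min(quotes, key=quotes.get)
    return supplier, quotes[supplier]

quotes = {"SupplierA": 12.50, "SupplierB": 11.80, "SupplierC": 13.10}
print(select_supplier(quotes))  # ('SupplierB', 11.8)
```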

The model is then to be scaled up to prove the effectiveness and characteristics of large-scale trading environments.

Some authenticity issues were considered for the market model. For example, if the "commodities" were physical entities like compact disks (CDs), are the suppliers to have a finite supply of CDs? However, if the project demonstration is to imitate the market for downloadable software (e.g., commercial MP3s), we could consider that the commodities are supplied from an infinite source. However, commercially downloadable software is not likely to be that simple: Suppliers may issue a license for negotiated numbers of copies, etc. We had to assess which issues of authenticity had any great relevance to the fundamental objectives of the project.

One issue of authenticity we did decide to demonstrate was the ability of brokerages to interwork and negotiate cooperative strategies. There have been a number of studies into the behavior of agent-based market entities [3]. While this project should not repeat those studies, we did want to consider cooperative market elements from the point of view of the agent architectures used in this project. One scenario we decided to adopt was the formation of a cartel in which a number of broker enterprises negotiate a fixed range of market prices. Cartels are frequently formed in real markets, often with monopolistic aims. Individual traders are differentiated by a range of other value-added services; however, it would be beyond the scope of this project to build elaborate service additionality. The cartel scenario was extended: Once the cartel was operating with some relative degree of stability, one of the brokers would break out of the cartel agreement and fix its prices autonomously. This--as in real life--would require that the breakaway business assess its current trading performance (sales, profits, etc.) and predict its performance under new prices. The new pricing strategy may set lower prices (i.e., lower profit margins) but aims to attract a larger market share. The converse strategy consists of raising prices and trying to achieve a larger profit from fewer sales in a smaller market share. These strategies are covered in much detail in [4].
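The breakaway broker's assessment can be illustrated with a simple projected-profit comparison between staying at the cartel price and undercutting it. All figures below (unit cost, market shares, market volume) are invented for illustration and are not results from the simulation:

```python
# Illustrative sketch of the breakaway assessment: a lower margin may
# still yield a higher profit if the assumed market-share gain is
# large enough. All numbers are invented.

def projected_profit(price: float, unit_cost: float, market_share: float,
                     market_volume: int) -> float:
    """Profit = margin per unit * expected units sold."""
    return (price - unit_cost) * market_share * market_volume

cartel = projected_profit(price=10.0, unit_cost=7.0, market_share=0.25,
                          market_volume=10_000)     # 3.0 margin * 2,500 units
breakaway = projected_profit(price=9.0, unit_cost=7.0, market_share=0.40,
                             market_volume=10_000)  # 2.0 margin * 4,000 units
print(breakaway > cartel)  # True for these invented figures
```

The converse strategy (raising prices for a larger margin on fewer sales) is the same comparison with the inequalities reversed.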

System architecture

Customer, supplier, and intermediary agents present a wide range of technical requirements. All three demand facilities ranging from basic abilities, such as detection of, and communication with, each other, through formation and disbanding of cooperative groups to meta-reasoning about efficient achievement of goals, financial decisions, efficiency, and scalability.

The advanced features of the scenario's intermediaries, for example, require that they dynamically reconfigure. This may occur to the extent that they no longer undertake any tasks in the market place that they had previously. Furthermore, all the agents have facilities to duplicate themselves and collectively create new agents and delegate collective tasks (for efficiency and scalability reasons). In complex dynamic environments and very large or variable-scale multi-agent systems, there are additional criteria for requiring self-building and self-organizing agents or multi-agent systems: the difficulty of their design or inability of designers to re-configure or re-design quickly enough [5].

The simulator in which the scenario agents and market place exist is controlled and monitored by an agent-coordinated distributed network of agent servers. In order to provide reliable automation of business processes, the agents have a logical architecture based on those of ADEPT [6]. Robust negotiation algorithms are taken from [7].

To facilitate the easy creation of customer, supplier, and intermediary agents of differing characteristics, the simulation is controlled and communicated with through various simulator agents. Through these agents, the operational characteristics of the environment in which the scenario agents reside may be tuned: for example, whether communication is reliable or unreliable, asynchronous or synchronous, computational resource bounds, relative computational speeds of agents, and so on. The simulator agents are also responsible for debugging, monitoring, and display of the state of individual agents, the (simulator/scenario) system as a whole, or data-visualization of abstractions such as market overview against time.
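The kind of environment tuning described above might be captured in a configuration record. The field names below are illustrative assumptions, not the simulator's actual interface:

```python
# Hypothetical sketch of tunable environment parameters for the
# simulator agents; field names and defaults are invented.
from dataclasses import dataclass

@dataclass
class EnvConfig:
    reliable_comms: bool = True       # reliable vs. unreliable channels
    synchronous: bool = False         # synchronous vs. asynchronous messaging
    msg_loss_rate: float = 0.0        # only meaningful when unreliable
    cpu_bound_per_cycle: int = 1000   # computational resource bound per agent
    relative_speed: float = 1.0       # relative computational speed of an agent

# Example: an unreliable environment that drops 5% of messages.
cfg = EnvConfig(reliable_comms=False, msg_loss_rate=0.05)
print(cfg.msg_loss_rate)  # 0.05
```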

In diagram 3 Fred and Wilma are the workstations on which the agent servers (respectively called Fred and Wilma) reside. Scenario agents are shown in red circles, simulator agents in green rounded boxes. (Note that the servers are themselves individual agents undertaking the task of being servers.) The simulator agents within Fred are shown. They are Controller (the agent responsible for coordinating the execution of scenario agents), ComsMngr (emulation of communication environment), and Trace/Debug and DataMontr (which monitors data about system operation as well as that of the scenario agents). To expedite large-scale tests, the number of agent servers/workstations is not limited and is partially dynamic.

Customer agents are those agents that purchase the basic market commodity (in our scenario, compact disks). In order to create a dynamic market, end-users' goals to purchase commodities must be simulated. Each customer agent generates these goals according to a probability distribution over a specifiable period of simulated time. The specifics of a request include a commodity identifier, the required volume, and a deadline for response from suppliers. Commodities are grouped into three price bands: low, middle, and high, for which individual agents have their own price range definitions. End-users' requests include the price band in which the customer expects the commodity to lie. Customer agents will purchase an item if its cost fits within the expected band's range. Generation of the requests is achieved by specifying the start time and length of the cycle, the relative weight of requests in each band, a probability distribution for making volume requests, a target average for volume requested, and definitions of the price band ranges. By specifying functions over these parameters, customer agents can produce complex patterns of demand (e.g., monthly demand cycles which have seasonal variation).
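Request generation under these parameters can be sketched as follows. The band ranges, weights, and volume distribution below are illustrative assumptions, not the project's actual settings:

```python
# Sketch of customer request generation: weighted band choice, an
# exponential volume distribution, and a response deadline. All
# parameters are invented for illustration.
import random

BANDS = {"low": (1.0, 5.0), "middle": (5.0, 12.0), "high": (12.0, 30.0)}
WEIGHTS = {"low": 0.5, "middle": 0.3, "high": 0.2}  # relative request weights

def generate_request(rng: random.Random) -> dict:
    band = rng.choices(list(WEIGHTS), weights=WEIGHTS.values())[0]
    return {
        "commodity": f"CD-{rng.randrange(1000):04d}",  # commodity identifier
        "volume": 1 + int(rng.expovariate(1 / 3)),     # volume, mean-3 exponential
        "band": band,                                  # expected price band
        "deadline": rng.randint(5, 20),                # simulated-time units
    }

rng = random.Random(42)
req = generate_request(rng)
print(req["band"] in BANDS and req["volume"] >= 1)  # True
```

Replacing the fixed `WEIGHTS` with functions of simulated time is what yields the cyclic and seasonal demand patterns mentioned above.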

Customer agents are capable of dynamically building preferences for suppliers on a per-commodity and general basis, and of propagating information about suppliers and their preferences (recommendations) to other customers. They can reason collectively with other customers to create new (intermediary) agents that undertake shared group tasks centrally, reducing duplicated system-wide tasks (such as modeling suppliers' catalogues) and obtaining cheaper bulk orders.

Supplier agents represent the businesses that supply the basic commodity of the market (i.e., compact disks). In contrast to customer agents, suppliers are currently limited in terms of dynamics. Their character within the market is defined in terms of the products they sell and their notions of low, middle, and high price bands. Although the product ranges do not vary periodically, each supplier has the ability to model and generalize customer commodity requirements and offer regular customers discounts and/or bulk deals. They can also monitor the requests they receive and analyze their responses over time. Reactions to such data may be to start selling often-requested commodities or to drop prices slightly to improve the number of sales of given items. Suppliers can also reason collectively with other suppliers to create new (intermediary) agents that undertake shared supplier tasks centrally, such as modeling customer commodity requests or an individual customer's payment reliability. Supplier agents can also undertake new tasks, such as using the intermediary to intelligently route messages to suppliers. Supplier agents can also clone themselves to increase joint throughput (cf. business expansion) or duplicate themselves while splitting their commodity base with the new agent (creation of specialized divisions).
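The supplier's reaction to monitored request data can be sketched as a simple price-adjustment rule. The conversion threshold and the size of the price cut are illustrative assumptions:

```python
# Hypothetical sketch of a supplier's monitoring reaction: if an item's
# recent conversion rate (sales per request) is low, drop its price
# slightly. The 50% threshold and 2% cut are invented parameters.

def adjust_price(price: float, requests: int, sales: int,
                 min_conversion: float = 0.5, cut: float = 0.02) -> float:
    """Lower a commodity's price a little when demand is not converting."""
    if requests > 0 and sales / requests < min_conversion:
        return round(price * (1 - cut), 2)
    return price

print(adjust_price(10.0, requests=40, sales=10))  # 0.25 < 0.5, so cut: 9.8
print(adjust_price(10.0, requests=40, sales=30))  # converting fine: 10.0
```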

Simulator agents can be instructed to create or destroy customer and supplier agents using periodic distribution functions in order to reflect variation of the number of users or on-line businesses over time.

Finally, with each supplier and customer agent having individually defined price band ranges, cooperating suppliers or customers tend to band together according to their notions of what is low cost, medium cost, and high cost. Similarly, the coupling between individual suppliers and customers reflects similarities of banding: since it is also agreeable if a supplier sells for less than expected, this relationship is non-trivial.
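The non-trivial banding relationship noted above can be made concrete: a quote inside the customer's expected band is acceptable, and so is one below it, so only the band's upper bound actually rejects offers. The band ranges here are invented for illustration:

```python
# Sketch of the customer's acceptance rule: prices inside the expected
# band, or cheaper than it, are agreeable. Band values are invented.

def acceptable(quote: float, band: tuple[float, float]) -> bool:
    """Accept prices inside the expected band, or cheaper than it."""
    _low, high = band  # the lower bound does not reject cheaper offers
    return quote <= high

customer_middle = (5.0, 12.0)  # one customer's notion of "middle"
print(acceptable(11.0, customer_middle))  # in band: True
print(acceptable(3.0, customer_middle))   # below band, still agreeable: True
print(acceptable(15.0, customer_middle))  # above band: False
```

This asymmetry is why the coupling between suppliers' and customers' band definitions is more than a simple equality match.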

As the operation of the customers and suppliers progresses, the market develops. The dynamics of the agents evolve (due to their respective modeling of the market and their operation). Intermediaries are dynamically created as required. The reasons for their creation are two-fold. First, groups of suppliers or customers create them for efficiency or scalability reasons. Second, they may be created to exploit the market. For example, a supplier may have identified a market for a new commodity (a compound of two or more basic commodities that always seem to be requested together), or it may unilaterally decide to create a new information commodity about the market. It may offer to sell its models of individual customer's reliability in making payments. In this case, the supplier may create the new intermediary with initial goals that cause it to operate as an autonomous business--that is, under its own control. In this situation the newly created intermediary is unlikely to be allowed free access to the commodities and services of its creator. Therefore the new intermediary has to acquire the components it needs to operate. If, for example, it intends to sell market-based information, such information changes over time. As a commodity, this is not bought in with a one-off payment. Rather it is continually bought as needed, or payment for obtaining it x times over a given period is required. In this case it is clear that different agents may have different patterns of demand, so some form of negotiation is required to facilitate the variation. The intermediary must therefore buy the component for negotiation. Likewise, agents needing this commodity must also acquire the negotiation component.

Creation of flexible agents that are capable of the features described above (cloning, modified duplication, delegation of tasks, migration of data, modification of their own behavior, creation of new agents, self-building, fine-grain operator monitoring and control, etc.) is prohibitively difficult using procedural languages such as C++ or rule-based languages such as CLIPS. Instead, the simulator and all the agents are implemented in a rule-based script, the interpreter for which is implemented in a mixture of such rules, Prolog, and C. The source code for the customers, suppliers, and intermediaries (and all but one simulator agent) is dynamically generated.

Conclusions

At present the trading environment successfully emulates numerous customers and suppliers with individual trading characteristics. The broker enterprises are currently being built and will take their positions in the simulated market. It is planned to populate the simulated market with a larger number of customers and suppliers. Each market-demonstration runtime can utilize a wide variation of characteristics for each of the players in the market. This will hopefully demonstrate the performance of agent-based enterprises in specific market conditions. It is anticipated that some of the results of the market performance will be presented at INET2000.

Further development

There are several potential routes of research stemming from the scenarios realized in this project. Diagram 4 shows one model of the increasingly complex information economy. The market consists of collaborations of enterprises networked in virtual organizations. These business relationships are often temporary, and most businesses belong to several such collaborations. Other entities in the market are suppliers of service components, customers, and various intermediaries. Further, many of these entities are capable of acting in a different role in other relationships. Collaborative organizations in future markets are therefore very dynamic.

Enterprises need information to derive market strategy. Agents currently seem to be a suitable technology for trawling the market to search, retrieve, and relay information to their host enterprise or organization. Of course, maintaining commercial security is a paramount requirement. There are two areas of development and evolution in this scenario: (i) the agents are now handling and interpreting qualitative information, rather than quantitative computable data; and (ii) traditional trading entities like brokers and intermediaries are evolving into on-line automated consultative roles, gathering information and becoming trusted neutral commercial entities. These consultative intermediaries would advise clients on their changing trading relationships with their clientele, providing logistics brokerage and environmental relationship management--research and consultancy relating a client's enterprise to its position in the market. This includes tracking the changing requirements of its customers, intelligent marketing, the threats and opportunities posed by other enterprises in the market, potential alliances, etc.
Intelligent Agents seem to offer an essential enabling technology for these processes, in both the search (and acquisition) of components, and for browsing for information from other agents who represent other partners and enterprises. Enterprises may outsource certain of their marketing services, such as the management of the virtual organizations, to consultative intermediaries. All of these businesses (including those in market consultancy roles) need to continually re-assess their position in the trading environment and reconfigure their business management and services deployment. Of course much of this model is beginning to take shape; for example, Application Service Providers (ASPs) are emerging and form a current role in on-line service provision.

A direct application for service component brokerage is in communications network management, where network and management resources are increasingly supplied by third parties. Bandwidth is already a brokered commodity. Computing power may also become a commodity, traded from areas of surplus (e.g., overnight, when some networks are very quiet) to areas needing extra capability. "Brokered net-computing" can offer an international grid of managed virtual networks to supply additional processing power. Security is, of course, a great consideration, as are the potential income to the providers and the lower infrastructure investment for the potential users. Network resources, such as protocol handlers and other functionality, are commodities that can be trawled and brokered into a network as and when required.

As the global information marketplace matures into a volatile commercial environment, there is an increasing need for intermediaries to provide services not only to customers and suppliers alike, but also to other suppliers of intermediary services. These trading models will become very complex with a requirement for near-instantaneous reconfiguration as new services emerge and new markets are formed. Brokered component-ware is seen as a necessary technology in this model, and this project is demonstrating how this technology can be realized.

References

  1. Foss, J E, Brokering Automated Enterprises, INET'99, http://www.isoc.org/inet99/proceedings/1d/1d_3.htm
  2. Jennings, N R, Turner, P J, Garcha, K, Foss, J D, Brokerage In An Information Economy, http://www.elec.qmw.ac.uk/dai/projects/brokerage/
  3. Hanson, J E, Kephart, J O, Sairamesh, J, Price-War Dynamics in a Free-Market Economy of Software Agents, Proceedings of ALIFE VI, Los Angeles, 1998, http://www.research.ibm.com/infoecon/paps/html/alife6/alife6_public.html
  4. Hanson, J E, Kephart, J O, Spontaneous Specialization in a Free Market Economy of Agents, Artificial Societies and Computational Markets Workshop, 2nd International Conference on Autonomous Agents '98, Minneapolis/St. Paul, Minnesota, 1998, http://www.research.ibm.com/infoecon/paps/html/aa98/aa98_public.html
  5. Turner, P J, Jennings, N R, On Scalability of Information Management Agents, Proceedings of the European IT Conference (EITC-97), Brussels, Belgium, http://www.elec.qmw.ac.uk/dai/people/phillip/publications/Scale_IM_Ag-EITC97.ps
  6. ADEPT - Advanced Decision Environment for Process Tasks, http://www.elec.qmw.ac.uk/dai/projects/adept/
  7. Faratin, P, Sierra, C, Jennings, N R, Negotiation Decision Functions for Autonomous Agents, Int. Journal of Robotics and Autonomous Systems.