Sunday, February 26, 2006

Java EE 5 Tools Preview

To give Sun credit, they have put out a pretty good preview release of the new Java EE 5 along with a new tools preview. A couple of things I noticed were BPEL support and UML support. It's also pretty fast. It has XML schema support like Eclipse, along with a WSDL editor. I think its XML support is a little less robust (from a graphical standpoint) than the Eclipse version. But at a quick glance I found that the UML support and the BPEL support were fairly easy to work with. I was even able to import the BPEL generated by the Sun tool into webMethods. It's worth a look.

Here is a screenshot of the BPEL composer:

Figure 1



Here is a screenshot of the UML editor:

Figure 2

Tuesday, February 21, 2006

Eclipse 3.1 with WTP 1.0

Still working on some interoperability testing with Eclipse and the new release of the WTP toolset. I found something interesting when working with the document/literal wrapped style of WSDL. Normally in your WSDL design with a document/literal wrapped style, you would have your operation name be the same as your input message name. And this in fact works for a large portion of the toolsets, including Axis 1.3, webMethods, .Net, and Sun's Studio Creator. It does not work, however, with Eclipse. In Eclipse, if you try to generate a proxy or a web service from that type of WSDL, you will not get the messages deserialized into Java beans. In other words, instead of a method like createTicket(ServiceRequest request), you will get createTicket(arg1, arg2, arg3, arg4, arg5).

The workaround I have found for this is to change the operation name to something other than the input message name. See the screenshot below; notice that the operation name is different from the input message. With the WSDL set up like this, the beans are generated correctly.
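
For reference, here is a minimal sketch of what the fixed-up WSDL looks like. The message and element names are illustrative, not from my actual service; the point is simply that the operation name no longer matches the input message name:

    <wsdl:message name="createTicket">
        <wsdl:part name="parameters" element="tns:createTicket"/>
    </wsdl:message>
    <wsdl:message name="createTicketResponse">
        <wsdl:part name="parameters" element="tns:createTicketResponse"/>
    </wsdl:message>

    <wsdl:portType name="TicketPortType">
        <!-- operation renamed so it differs from the input message name -->
        <wsdl:operation name="submitTicket">
            <wsdl:input message="tns:createTicket"/>
            <wsdl:output message="tns:createTicketResponse"/>
        </wsdl:operation>
    </wsdl:portType>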



Not really the ideal thing to have to do. I'm attempting to work with the Eclipse folks to see if this is a bug or intended behavior.

Friday, February 17, 2006

More on WSDL Design

This is kind of a strange thing to happen, but if you run into it this might help. I was importing XML schema into some WSDL using HTTP as the location and got an exception from java.net.URLConnection when using Axis 1.3 from the command line and with Eclipse. Here is the import statement:

schemaLocation="http://sn000046:8081/schemas/servicerequest.xsd"

A very common import statement, so I couldn't figure out why the error was being thrown. It turns out I was hosting the .xsd schema on an Apache Tomcat server on which I hadn't defined the content type for xsd or wsdl as XML, so the response header coming back from the server had no content-type specified. Easy to fix: just add the mappings to the web.xml, either in the individual project or in the global conf/web.xml file.
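
The mappings use the standard servlet deployment descriptor syntax; something like this in web.xml does the trick:

    <mime-mapping>
        <extension>xsd</extension>
        <mime-type>text/xml</mime-type>
    </mime-mapping>
    <mime-mapping>
        <extension>wsdl</extension>
        <mime-type>text/xml</mime-type>
    </mime-mapping>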

This is why interoperability testing is important. This same WSDL consumed by other platforms (.Net and webMethods) did not have a problem with the content-type being absent.

Monday, February 13, 2006

WSDL and Eclipse 3.1

I am doing some work on modular WSDL. I was looking for a good WSDL editor that was low cost, as in free. I don't know how much hand coding you have tried with WSDL, but it can become a bit tricky, especially when troubleshooting. It comes in handy to have a tool that shows you where some of your links may be off.
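
To give an idea of the kind of links involved, here is a minimal sketch of a modular WSDL types section pulling in an external schema (the namespace and schema location here are hypothetical):

    <wsdl:types>
        <xsd:schema>
            <!-- this is the link that has to resolve: the import's namespace
                 must match the imported schema's targetNamespace -->
            <xsd:import namespace="http://example.com/tickets/types"
                        schemaLocation="servicerequest.xsd"/>
        </xsd:schema>
    </wsdl:types>

The messages section then references elements from the imported namespace, which is exactly the linkage a graphical editor can show you at a glance.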

Altova is okay for syntax checking, but to see graphical WSDL you have to purchase a licensed copy. I had been playing around with Eclipse 3.1 and their web tools for a while doing some interoperability testing. I did not realize at the time that they had a built-in XML schema editor as well as a WSDL editor. It turns out it is not bad at all for the price. They have just released version 1.0 of the web tools.

Here are a couple of screenshots to illustrate what I mean by graphical layout. In Figure 1 you can see the correct linkage being made. This saves a lot of time.

Figure 1.


Figure 2 shows the imported schema linked into the messages section. This is also nice in that it lets you verify that your linkages are correct.

Figure 2.


Here is a site with some more good tips on using XML schemas and WSDL. I also found another site where the authors are using modular WSDL in their design. It's a good reference if you are just getting started.

Friday, February 10, 2006

Real Time Integration aka EDA

Brenda Michelson has another excellent post on Event Driven Architecture. For those of you from the EAI world, we used to call this Real Time Integration. Jump over and read her article on Event Driven Architecture and then come back and read about some issues associated with it. I've posted below an excerpt from a white paper I wrote a while back on Real Time Integration challenges. Feel free to chime in with other challenges you have run into when doing Real Time Integration.

Excerpt

Issues associated with real time integration are numerous and can have a large impact on the success of a real time integration project. This white paper is designed to give the developer an overview of these issues before undertaking an integration project.

Background

What is meant by real time? For the purpose of this discussion, real time means a business event is processed as it happens, or within a short time period thereafter, by another application (or applications). This time lag could be a few seconds or several minutes. This differs from batch integration, which is typically defined as a grouping of multiple events or data points processed in a single scheduled run.

This need for real time integration introduces multiple issues and challenges which must be overcome to produce a successful, reliable integration project. A thorough understanding of these design issues is necessary before beginning an integration project.

Integration Challenges

Real time integration brings forth a host of challenges driven by the need to consume business events as they happen. These challenges can be categorized into the following topics: Event Notification, Event Delivery, Event Persistence, Event Recovery, Event Consumption, and Event Monitoring.


Event Notification


In a typical integration scenario, there is at least one event producer and at least one event consumer (multiple consumers are discussed under Event Consumption). The first challenge is notification that a business event has occurred in the producing application. How is this event generated, how is it captured by the integration layer, how is it persisted in case of connectivity failure between the producing application and the integration layer, and how is it rolled back if needed?

Most applications do not provide an internal mechanism for event notification or for storing event information for use by integration platforms. Applications typically have application programming interfaces (APIs) for interfacing; however, these APIs usually do not address real time event notification (capture). APIs typically allow event generation (e.g., create an invoice), data interrogation (e.g., return a specific invoice), and even import and export of data.

Third-party integration platforms such as webMethods provide "adapters" for common types of third-party applications (e.g., Oracle, DB2, PeopleSoft). These adapters are designed to be aware of events in the applications, persist the events at the source when possible, and aid in the delivery of the event and its content to the integration layer. To summarize, the adapter is responsible for observing the event, capturing the event, persisting the event, and aiding in the delivery to the integration layer. The adapter may also be responsible for assisting in "at least once" and/or "once and only once" delivery to the integration layer.
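
To make the division of labor concrete, here is a minimal sketch of those adapter responsibilities in Java. All of the names are hypothetical, not from webMethods or any vendor API:

    /** Carrier for a captured business event. */
    class BusinessEvent {
        final String id;      // unique event id, useful for duplicate detection
        final String type;    // e.g. "invoice.created"
        final String payload; // event content in the source system's format

        BusinessEvent(String id, String type, String payload) {
            this.id = id;
            this.type = type;
            this.payload = payload;
        }
    }

    /** Observes, captures, persists, and delivers events from a source application. */
    interface EventAdapter {
        /** Observe the source application and capture the next business event. */
        BusinessEvent capture();

        /** Persist the event at the source so it survives connectivity failures. */
        void persist(BusinessEvent event);

        /** Hand the event to the integration layer with the configured
            guarantee ("at least once" or "once and only once"). */
        void deliver(BusinessEvent event);
    }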

Event Persistence

Persisting the event once it has happened is the responsibility of the integration adapter. Event persistence is important to maintain the history of the event, the contents of the event, and the notification that the event occurred. To ensure that the event data is recoverable after a transient infrastructure failure, the persistence should always occur on the source system. This also aids in de-coupling the source application from the rest of the integration layer infrastructure as well as from any target receiving system. For example, if the integration platform were down for maintenance or due to a failure, the event would be collected on the source system and sent to the integration layer automatically once connectivity was re-established.
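
Here is a hedged sketch of that source-side store-and-forward behavior, reusing the hypothetical BusinessEvent type from the sketch above. The in-memory queue stands in for what should be a durable local store in a real adapter:

    /** Client view of the integration layer (hypothetical interface). */
    interface IntegrationLayerClient {
        /** Returns true if the integration layer accepted the event. */
        boolean send(BusinessEvent event);
    }

    class StoreAndForward {
        // stand-in for a durable store on the source system
        private final java.util.Queue<BusinessEvent> localStore =
                new java.util.concurrent.ConcurrentLinkedQueue<BusinessEvent>();

        /** Always record the event locally first so an outage cannot lose it. */
        void onEvent(BusinessEvent event) {
            localStore.add(event);
        }

        /** Called periodically; drains the store once connectivity is back. */
        void forward(IntegrationLayerClient client) {
            BusinessEvent event;
            while ((event = localStore.peek()) != null) {
                if (!client.send(event)) {
                    break; // layer unreachable; keep the event and retry later
                }
                localStore.poll(); // remove only after a successful hand-off
            }
        }
    }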

Event Delivery

Delivering the event data to the integration layer once the event has occurred is generally the responsibility of the adapter. There are multiple ways to do this and different adapter types use different methods.

Event Recovery

There are multiple recovery situations that can occur within a typical integration. Designing the integration to be recoverable is a key step in ensuring the event does not get lost.

Event Recovery - Infrastructure Failure

Infrastructure failures can come in a variety of forms that integrations have to take into account: network failures, application failures, database failures, integration software failures, and hardware failures. These types of failures can be accounted for during design, and event recovery can be automated within the integration.

Event Recovery - Logic Failure/Error

Logic failures can occur in the source application, the integration layer, and the target application(s). These failures typically cannot be recovered from automatically. However, notification of the event logic error is critical. Once notification of the error is received, the event can be repaired and resubmitted/reprocessed within the integration layer, provided the error occurred there.

Event Persistence and Event Recovery are tightly tied together during the design phase. If the event is not persisted correctly and at the right place(s) then recovery becomes difficult.

Event Consumption

Event consumption can be separated into two parts: Integration layer event consumption and target application event consumption.

Integration layer event consumption occurs after the hand-off from the event source application. Typically, in well-designed integrations, the event is persisted before it is consumed. Consumption may consist of simply forwarding the event on to interested target applications. In most cases, however, the event needs to be translated into the target application's desired format.

The transformation process is an important part of the integration layer. It is at this layer that a common data model, also called a common document, can be inserted. This common document plays an important role in the de-coupling of source and target interfaces. By translating the source data into this common document, the target applications do not know or care about the data structure of the event producing application.

As the common document is passed through the integration layer to the target applications, it is translated again into the format supported by each individual target application. These additional translation and persistence points allow for further de-coupling of the source and target applications.
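
A minimal sketch of this common-document pattern, with hypothetical names:

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    /** Canonical representation shared by all applications. */
    class CommonDocument {
        String eventType;
        Map<String, String> fields = new HashMap<String, String>();
    }

    /** Translates a source payload into the common document. */
    interface SourceTranslator {
        CommonDocument toCommon(String sourcePayload);
    }

    /** Translates the common document into one target's format. */
    interface TargetTranslator {
        String fromCommon(CommonDocument doc);
    }

    class IntegrationLayer {
        /** Source data is translated once into the common document, then once
            per target; neither side needs to know the other's format. */
        void route(String sourcePayload, SourceTranslator in, List<TargetTranslator> outs) {
            CommonDocument common = in.toCommon(sourcePayload);
            for (TargetTranslator out : outs) {
                dispatch(out.fromCommon(common));
            }
        }

        void dispatch(String targetPayload) {
            // deliver to the target application (transport omitted in this sketch)
        }
    }

Because the targets only ever see the common document, adding a new source application requires only one new SourceTranslator, not one translator per target.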

Target application event consumption occurs when the event has finished its route through the integration layer. The event data has been translated into the target application's data model. The interface into the target application depends on how the target application has exposed its methods.

Important design considerations: What happens if the update of the target application fails due to a data error? What happens if two separate applications are being updated and one succeeds and one fails? Is there an event sequencing requirement? What happens if the update of the target application fails due to an infrastructure failure?

Event Monitoring

As events are happening in real time, event monitoring becomes critical. Event monitoring can be very complex depending on the overall design of the integration layer. Event monitoring can be divided into several layers: Source and Target Application monitors, Infrastructure monitors, Integration Layer Component monitors, and Transaction monitors.

Application Monitors

Application monitors are specifically designed to monitor the running application (e.g., an Oracle database monitor). These monitors typically have specialized knowledge of the application they are monitoring; however, they can be as simple as checking whether the application is up or down.

Infrastructure Monitors

Infrastructure monitors typically monitor the underlying server hardware, operating system and network.

Integration Component Monitors

Integration component monitors typically monitor the individual pieces of an integration. An example would be a custom queue monitor that alerts when queue depths for a particular integration have exceeded a certain limit.

Transaction Monitors

Transaction monitors are used to monitor an individual transaction from end to end. This is the most sophisticated type of monitor and yields the best results for monitoring an integration. It is also the most difficult to implement.

Conclusion

Real time integration can provide great business value to the corporation. Events consumed as they happen can reduce cycle times, make the business more responsive to customers, and increase competitive advantage. Real time integration can also bring a host of issues, as this paper has outlined. Understanding these issues prior to undertaking the project will significantly increase the integration's chance of success.

Wednesday, February 08, 2006

Jonathan Schwartz's Weblog

This is a good post from Jonathan on free software. It is good because of one phrase he uses in justifying giving away software: "it amplifies adoption". Grabbing new customers and getting the software into the hands of the folks who will be working with it will be the key in the next few years for gaining market share and keeping existing market share. A developer, an architect, a programmer will always tend to recommend, and lean toward, a solution that they are familiar and comfortable with.

If you are a software vendor with closed source and your strategy for the next few years is to keep it that way, I think we can all see the ending. There is no longer a middle ground for software vendors. There is the upper tier (IBM, Oracle, Microsoft), and then there is everybody else. Everybody else is competing with open source solutions, solutions that are "good enough". Even the upper tier players are feeling this effect. But the lower and middle tier vendors tend to have a more fragile bottom line that is greatly affected by even a percentage point or two of market share.

Another thought on this: companies that can successfully implement a changeover to a SOA style of architecture will make this even worse for the lower and middle tier vendors. A SOA based upon the WSF standards and good design principles will make it easier to swap out vendors.

Friday, February 03, 2006

5 careers: Big demand, big pay - Feb. 3, 2006

I thought this was interesting, especially in contrast to my previous post about free software. Notice the demand for .Net developers. Of course, it also mentions a demand for software quality management analysts. Are the two related? :)

Slashdot | VMware to Make Server Product Free (as in beer)

More software going to the free side. The pressure is increasing on vendors. We appear to be in the middle (maybe the late beginning, it's hard to tell) of a transformation of the software industry. Business as usual, which is high license fees and high maintenance contracts for questionable quality and even more questionable support, is going by the wayside.

This is probably a good thing. Increased quality, better support, and innovation could all be the result of this pressure. Only time will tell how it is going to play out. I say again: you should know what your vendors' strategies are and how they are going to react to this. If they don't have a plan, watch out.

Wednesday, February 01, 2006

Who's killing the software industry the fastest? | Service-Oriented Architecture | ZDNet.com

Joe's got another good article on SOA and Open Source. I do believe that the greatest threat to traditional software vendors is Open Source and not so much SOA. Although, as I said before, the companies that can harness both correctly will be out in front of their competition. The key is the "harness correctly" part. Good, effective SOA is easier said than done.