TUGAIT is one of the most important community events in Europe; speakers from many countries join the conference, and most of them are Microsoft MVPs.
Community events are the highest expression of passion for technology, because they are organized purely to share knowledge and experience.
This year I will present two sessions (here is the full agenda): one focused on Microsoft Azure, the second focused more on development.
In the first session, Microsoft Azure Event Hubs workout session, I will present
how I see Event Hubs, the technologies normally involved with it, and how to use it in real scenarios.
For this session I prepared a very interesting demo which involves Event Hubs, Stream Analytics, Service Bus, Data Factory, Data Catalog, Power BI, Logic Apps, and on-premises integration.
I will explain how to manage Event Hubs, the most important challenges in scenarios that use it, and how to solve them.
We will compare Event Hubs with other technologies and see how to use it in different scenarios like IoT, event-driven integration, broadcast messaging and more.
In the second session, How to face the integration challenges using open source, I will present how to integrate technologies today using all the technologies available, the differences between them, and the best approaches to use.
I will present the most interesting challenges in hybrid integration and messaging scenarios using cloud and on-premises options.
We will examine all the technology stacks available and how to combine them to solve integration scenarios like hybrid data ingestion, big data movement, high load, reliable messaging and more.
We will see how to manipulate specific technology stacks and how to use them in different ways to save costs and obtain first-class solutions.
I'm very excited because TUGAIT is a very important event and a great opportunity to share your personal experience with a lot of people and have a lot of fun in a perfect community spirit.
TUGAIT is free and you can register here.
See you there!
I faced this error and it's quite complicated to solve, so I'm writing this post to keep notes and hopefully provide some support to other developers.
I spent a lot of time on this error; there are many causes and many different scenarios behind it.
I looked on the internet and tried every explicable and inexplicable workaround I found on Stack Overflow.
In my opinion it is a common symptom created by different things, which are:
- A different .NET version used by some of the libraries; sometimes we change that inadvertently.
- A dependency breakdown in Visual Studio; in that case we can build using MSBuild but we can't build from the Visual Studio UI.
- A NuGet package not updated in one of the referenced libraries.
- Project files with different configurations.
The quickest way to fix this issue is to follow some main steps:
- Fix the target framework. If you have a few projects this is fine, but if you have 70 or more projects it could be a problem, so
download the Target Framework Migrator and align the target framework in all projects in one click; it is a great tool for that.
- Fix NuGet: open the NuGet console
and execute nuget restore SolutionName.sln
- Align and check the build configuration: right-click the solution and select Properties.
Select Configuration Properties and check that all the projects are using the same configuration.
Check that all the proper projects have the Build checkbox selected.
Some external projects, like WiX or SQL projects, could have the Build checkbox selected; if this is the case, check that the specific project builds correctly, or uncheck the box and apply.
- Unload all the projects from the solution and start reloading each project, beginning from the base library.
Every time you reload one, rebuild the solution.
At the end of this procedure the problem should definitely be solved.
In the last period, I invested time in something that I think is one of the most important aspects of the integration space: perception.
During my last events I started introducing this topic, and I'm going to add a lot more in the near future.
In Belgium, during the BTUG.be event, I showed my point of view about open patterns, and in London, during the GIB, I presented how I approach technology and how I like to use it; I was surprised by how many people enjoyed my sharing.
The Microsoft stack is full of interesting options in the on-premises and Azure cloud space, and people like to understand more about how to combine them and how to use them in the most profitable way.
In my opinion, the perception we have of a technology is the key, not the technology itself, and this is not just about technology; it applies to everything in our lives.
We can learn how to develop using any specific technology stack quite easily, but it is more complicated to get the best perception of it.
I like to use my great passion for skateboarding to explain this concept: a skateboard is one of the simplest objects we can find, just a board with four wheels and a couple of trucks.
It is amazing to see how people use the same object in so many different ways, like vert ramps, freestyle, downhill or street, with so many different combinations of styles in each discipline, and each skater has a style of their own as well.
The same thing needs to be done with technology. I normally consider four main areas: BizTalk Server, Microsoft Azure, Microsoft .NET or SQL, and the main open source stacks.
I don't like to focus on a single technology stack, and I don't think it is correct to use a single technology stack to solve a problem.
For instance, BizTalk Server is an amazing platform, full of features and able to cover any type of integration requirement, and looking at the BizTalk architecture we can find a lot of interesting considerations.
The slide below is a very famous one, used in millions of presentations.
Most people look at BizTalk Server as a single box with receive ports, hosts, orchestrations, adapters, pipelines and so on.
When I look at BizTalk Server I see a lot of different technology stacks to use, to combine together, and to use with other stacks as well.
I can change any BizTalk Server behaviour and completely reinvent the platform as I want; I don't see a real limit on that, and the same goes for the Microsoft Azure stack.
Microsoft Azure offers thousands of options; the complicated aspect is the perception we have of each of them.
Many times our perception is influenced by the messages we receive from the community or from companies. I normally like to approach a new technology stack without considering any specific message, like a kid approaching a new toy: I don't have any preconception.
Today we face a lot of different challenges, and one of the most interesting is on-premises integration.
The internal BizTalk architecture itself is a good guide to use; we have all the main concepts like mediation, adaptation, transformation, resilience, tracking and so on.
If we split the architecture into two different on-premises areas, we open many points of discussion: how do we solve a scenario like that today?
Below is the same architecture but using every technology stack available; during the GIB I showed some real cases and samples about that.
I also like to consider event-driven integration, and in that case GrabCaster is a fantastic option.
My next closest event will be TUGAIT in Portugal, the most important community event organized in Portugal; last year I had the privilege to be there and it was an amazing experience.
Three days of technical sessions covering every technology stack: integration, development, database, IT, CRM and more.
Many people attend TUGAIT from all parts of Europe; I strongly recommend being there, it is a nice event and a great experience.
My previous post is here.
I keep going with my assessment of Logic Apps, and I have to say that the more I dig, the more I like it, and there are many reasons for that.
I love technology in all its aspects; one of the things that fascinates me most is what I like to call the technology perception. I like to see how different people approach the same technology stack, and Logic Apps is a great example for me, as are BizTalk Server, Azure Event Hubs and others.
When I look at Logic Apps I see an open and fully extensible Azure stack to integrate and drive my Azure processes. These days I'm pushing the stack quite hard and trying many different experiments, and I have been impressed by the simplicity of the usability and the extensibility; I will speak in more detail about that at my next events, but there are some relevant points of discussion.
Development comfort zone
Logic Apps offers many options: Web UI, Visual Studio, and scripting.
As I said in my previous post, the Web UI is very fast and consistent, but we can also use Visual Studio.
To develop using Visual Studio we just need to install the latest Azure SDK; I'm not going to explain the details, as it is very easy to do and you can find everything you need here.
I would just like you to notice some interesting points, like the project distribution and the extensibility. First of all, to create a new Logic Apps process we need to select New Project/Cloud/Resource Group,
and we will receive a long list of templates, with the Logic Apps one among them.
I definitely like the idea of using multiple templates as Azure resource groups; I can create my own templates very easily and release them on GitHub for my team.
In Visual Studio we have two main options: using the designer, by selecting the LogicApp.json file, right-clicking and choosing "Open using the Logic App designer", or working directly in the JSON file.
The second option, the editor, offers the maximum granularity and transparency.
About the development experience, I have to say that it is very easy: just right-click and Deploy.
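For orientation, the LogicApp.json file edited in code view follows the Logic Apps workflow definition language. A minimal sketch (the trigger and action names here are illustrative and the URI is a placeholder) looks roughly like this:

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "triggers": {
      "manual": {
        "type": "Request",
        "kind": "Http"
      }
    },
    "actions": {
      "CheckStatus": {
        "type": "Http",
        "inputs": {
          "method": "GET",
          "uri": "https://example.com/api/status"
        },
        "runAfter": {}
      }
    },
    "outputs": {}
  }
}
```

Everything the designer shows, triggers, actions and their ordering via runAfter, maps directly onto this JSON, which is why the editor gives the maximum granularity.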
Logic Apps offers a lot of connectors, and you don't need to be an expert developer to create a new workflow: it is very simple to integrate an external stack and implement a workflow able to exchange data between two systems. But my interest has been focused on another aspect: how simple it is to extend it and create my own application blocks, triggers and actions.
I had a look at the possibility of extending the triggers and actions, and I have to say that it is very simple. We have two options: we can navigate GitHub and start from one of the templates here (Jeff Hollan has created an impressive number of them, all open source; I normally use these repositories to get the information I need), or we can just create a new one from scratch.
So how easy is it to create a new one from scratch? From Visual Studio we need to select New Project/ASP.NET Web Application/Azure Web API, set our Azure hosting settings,
and we only need to implement our Get and GenerateAsyncResponse methods, able to provide the HttpResponseMessage that the Logic App flow needs to invoke for the trigger.
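For illustration only, the contract behind a polling trigger is simple: return the payload with a 200 status when there is data to process, or a 202 with a retry hint when there is not. A minimal Python sketch of that logic (the function and field names are my own, not the actual Web API code):

```python
import json

def poll_trigger_response(pending_items):
    """Mimic the polling-trigger contract a Logic App expects:
    200 plus a payload when data is available, 202 when it is not."""
    if pending_items:
        body = json.dumps({"items": pending_items})
        return 200, {"Content-Type": "application/json"}, body
    # No data yet: tell the Logic App engine to poll again later.
    return 202, {"Retry-After": "15"}, ""
```

The real trigger implements exactly this decision inside the HttpResponseMessage it returns.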
For the deployment, we can publish from Visual Studio or, a very interesting option for continuous integration, use GitHub; I tried both, and I have been very satisfied by the simplicity of using GitHub.
To integrate GitHub we just need to create a new deployment slot in our Web API.
The slot is another Web API able, via REST, to integrate with the source control we want to use; we need to select our new slot
and select the source type we want to use. Absolutely easy and fast, great job.
I also appreciated the possibility of using the Live Data telemetry, which is very useful during testing and development, and we have the same thing in the Azure portal.
I have so much to say about Logic Apps; you can find a lot of material and resources on the internet, and please feel free to contact me for any more information.
I'm also experimenting with many interesting things using Logic Apps in conjunction with BizTalk Server; I will speak in a lot more detail at my next events. I'm definitely having so much fun playing with it.
Working with secure channels in SOAP and WCF can sometimes be a very complex activity.
BizTalk Server provides many adapters able to cover any requirement, and in the case of very complex challenges we can use the WCF-Custom adapter to implement more complex and specific binding settings with very high granularity.
The biggest issues are normally related to the binding (security), customization and troubleshooting.
Sometimes we need to face very complex security challenges, and the strategy used to solve the challenge as quickly as possible is critical.
In the case of a complex binding, the best strategy is to use a .NET approach as a first step and switch to BizTalk afterwards.
We can use a classic sample as below: mutual certificate authentication in SOAP 1.2 and TLS encryption with a Java service.
I see two main advantages in using the .NET configuration file approach:
Visual Studio provides very useful IntelliSense, and it's very easy to extend and change the binding and test it quickly.
Documentation and support
In a security challenge, the possibility of using resources on the web is crucial; most of the documentation covers the WCF .NET approach, and you will find a lot of samples using the Web.config or App.config file approach.
For that reason a .NET approach is faster and easier to use and test.
A binding section for mutual certificate via TLS looks like the one below.
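As a sketch of what that section can contain (assuming a WCF customBinding; the binding name is illustrative and the exact security settings must match the service's WSDL):

```xml
<bindings>
  <customBinding>
    <binding name="mutualCertificateTls">
      <!-- Message-level mutual certificate security -->
      <security authenticationMode="MutualCertificate"
                includeTimestamp="true" />
      <!-- SOAP 1.2 envelope, as required by the Java service -->
      <textMessageEncoding messageVersion="Soap12" />
      <!-- TLS transport presenting a client certificate -->
      <httpsTransport requireClientCertificate="true" />
    </binding>
  </customBinding>
</bindings>
```

Each of these elements can be tuned quickly with IntelliSense in the .config file before moving anything into BizTalk.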
And below, the behaviour section.
<behaviors>
  <endpointBehaviors>
    <behavior>
      <clientCredentials>
        <clientCertificate findValue="mydomain.westeurope.cloudapp.azure.com"
                           storeLocation="LocalMachine" storeName="My"
                           x509FindType="FindBySubjectName" />
        <serviceCertificate>
          <defaultCertificate findValue="mydomain-iso-400"
                              storeLocation="LocalMachine" storeName="TrustedPeople"
                              x509FindType="FindBySubjectName" />
        </serviceCertificate>
      </clientCredentials>
    </behavior>
  </endpointBehaviors>
</behaviors>
When we are sure about our tests and that everything is running, we can easily switch to BizTalk Server and create the custom bindings.
The WCF-Custom adapter in general provides the same sections; what we need to do is create a WCF-Custom adapter and a static solicit-response send port, after which we can easily insert our bindings and behaviors.
In the case of specific settings we can import the bindings as well; a great feature offered by BizTalk is the possibility to import and export our bindings, so we can quickly experiment with any complex binding and import it into our WCF-Custom adapter afterwards.
Sometimes external services require very complex customization and we need to override protocol or messaging behaviour in the channel; for instance, some services don't accept mustUnderstand in the SOAP header,
or we need to impersonate a specific user by certificate in the header, or just manage a custom SOAP header.
In my experience the best strategy is to develop the custom behaviour in a WCF .NET project; this is the fastest way to test a WCF behavior without having to manage GAC deployments, host instance restarts and so on.
When the WCF behavior works, we can easily configure it in the BizTalk port.
Using a .NET approach, we need to add the WCF behavior by reference,
configure it in the .config file and test/debug it.
When everything is working, we will be able to add the behavior in BizTalk by adding the component to the GAC and adding the behavior to the BizTalk port.
The WCF-Custom BizTalk Server adapter offers a very good level of customization through the bindings and behavior tabs.
The most complex aspects in this area are security and message inspection. For troubleshooting I recommend two things: one, using Fiddler or Wireshark, and two, WCF logging; I recommend using them together as they complement each other.
Fiddler is a very powerful free tool, easy to use: just run it and use it.
In the case of BizTalk Server we need to configure the framework to use Fiddler, and BizTalk offers many easy ways to do that.
By the port, if we want to affect the port only.
By the adapter host handler, if we want to affect all the artefacts under it.
For deep-level sniffing, or if we need to sniff Net.TCP or other protocols, I recommend Wireshark; it is a bit more complex to use, but this is the tool.
To configure WCF logging we simply need to add the section below to the BizTalk configuration file to affect BizTalk services only, to the machine.config file to affect all the services on the entire machine, or to the Web.config to affect a specific service.
<!-- DIAGNOSTICS -->
<system.diagnostics>
  <sources>
    <source propagateActivity="true" name="System.ServiceModel" switchValue="Error,ActivityTracing">
      <listeners>
        <add type="System.Diagnostics.DefaultTraceListener" name="Default" />
        <add initializeData="c:\logs\app_trace\logClient.svclog"
             type="System.Diagnostics.XmlWriterTraceListener, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
             name="ServiceModelTraceListener" traceOutputOptions="Timestamp" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
After a while I'm back into Logic Apps: a customer asked me to have a look at the possibility of using Logic Apps in a very interesting scenario.
The customer is currently using BizTalk Server in conjunction with Azure Service Bus and a quite important Java integration layer.
This is not a usual POC (Proof of Concept); in this specific case I really need to understand the actual capabilities provided by Logic Apps in order to cover all the requirements, because the customer is thinking of moving and migrating part of the on-premises processing into the cloud.
A migration or refactoring is normally considered an important operation which involves many critical aspects like productivity, costs, performance, pros and cons, risks and important investments.
I started having a look at the new Logic Apps version, and I focused on all of these factors without losing sight of the customer's objectives.
The first look at the main Logic Apps portal shows a clear and very well organized view; a quick tutorial shows how to create a very simple flow, and the main page is organized into main categories.
A very intuitive approach is the possibility to start with the common triggers; it would be very interesting to be able to customize this area per developer profile, and I'm sure the team is already working on that.
I selected one of the most used: an HTTP endpoint in the cloud, in order to consume a process workflow.
The response time makes a very good impression, a lot faster, and so do all the new features in the UI.
The top-down approach is very intuitive, as it follows the natural approach used in workflow development, and the one-click New Step offers a fast way to add new components very quickly.
What I normally like to consider when I look at a technology stack are the small details, which can tell me a lot of important things. For instance, when selecting the New Step box and switching from the designer to the code view, the UI maintains exactly the selection state; considering that I'm using a Web UI, this is a very appreciated behaviour.
Looking at the New Step box, it is clear that the Logic Apps team is following the BizTalk Server orchestration pattern approach. I like that, because it provides the same developer experience used in BizTalk; in that case I don't need to explain to developers how to approach a Logic Apps flow, as they already know the meaning of each step and how to use it.
In terms of mediation, the approach used by Logic Apps is what I'd like to call Fast Consuming Mediation; essentially the process includes the mediation, and it provides a fast way to do that.
This is different from BizTalk Server, where the mediation is completely abstracted from the process; in the case of Logic Apps I see some very good points in favour of a fast mediation approach.
Logic Apps uses a concept of fast consuming: the approach to creating a new mediation endpoint is very RAD oriented.
All the most important settings you need to create a mediation endpoint are immediately accessible, fast and easy. The HTTP action is a very good example: we just add the action and it creates a new HTTP endpoint automatically. A very fast and productive approach: fast consuming.
Everyone knows how important productivity is to me; the search box in all the Logic Apps list features is very appreciated.
Also appreciated is the quick feature bar on the left, with all the most important tasks and a search box as well.
Following the quick tutorial, I created an HTTP endpoint and added a new action; I decided to add the Office 365 Email action and just try to send the HTTP content to an email account.
Each option provides fast dynamic parameter content which proposes all the public properties exposed by the previous step; this is very useful and fast, it shows the most important options, and it is good for speeding up development.
I just saved the flow and it created the POST URL to use; I tried to send a message using SoapUI and it worked perfectly.
If you are interested in the SoapUI project: just add a new REST project, put in the URL created by the Logic Apps flow, set POST, media type text/xml, write any message and send; you will receive a 202 response.
Check your inbox.
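The same test can be sketched in code; here is a rough Python equivalent that builds the POST request (the URL is a placeholder for the one generated by your Logic App, and the actual send is left commented out):

```python
import urllib.request

def build_logic_app_request(url, payload):
    """Build the POST request a Logic Apps HTTP trigger expects."""
    return urllib.request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
        method="POST",
    )

req = build_logic_app_request(
    "https://prod-00.westeurope.logic.azure.com/workflows/PLACEHOLDER/triggers/manual/run",
    "<message>hello</message>",
)
# urllib.request.urlopen(req)  # the trigger answers with 202 Accepted
```

Any HTTP client that can POST text/xml to the generated URL will fire the flow in the same way SoapUI does.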
I'm definitely happy with the result: I created an HTTP endpoint able to ingest a POST message and send an email in less than two minutes. Quite impressive, honestly.
Thanks to the BizTalk360 team and my dear friend Saravana, my blog now has a fantastic new look and a lot of new improvements; first of all, it is tremendously faster.
Well, I liked my previous style, but it was old and I think it was time for a good refresh. The BizTalk360 guys redid the graphics, migrated the blog with all its content, and managed the redirections and all the other issues in a couple of days without any problem, fast and easy; this is what I like to call done and dusted!
I really like the new style, with more graphics and colours. I created a new About area, and now I need to fix my categories and add some more new content; the new look definitely motivates me to work more on it.
I have a very nice relationship with the BizTalk360 team, and it grows stronger all the time; at every event we take the opportunity to enjoy time together, speaking about technologies and having fun.
I'm always happy to collaborate with them. The company is very solid, with very strong experts inside, and their products, BizTalk360, a top product in the market for BizTalk Server, and ServiceBus360, a top product in the market for Microsoft Azure Service Bus, prove the level of quality and the effort this company is able to deliver; I'm always impressed by the productivity of this team.
We also have some plans around BizTalk NoS Ultimate, the best tool ever to optimize productivity during BizTalk Server development, but this is classified for now.
Most of all I’m happy for this new opportunity to work closer together.
Thank you guys, you rock!
Why did people start comparing BizTalk Server to a T-Rex?
A long time ago the Microsoft marketing team created this mousepad on the occasion of BizTalk's 12th birthday.
And my dear friend Sandro produced the fantastic sticker below.
Some time later, people started comparing BizTalk Server to a T-Rex; honestly, I don't remember the exact reasons why, but I'd like to give my opinion about it.
Why a T-Rex?
In the last year I took on many missions in the UK around BizTalk Server: assessments, mentoring, development, migration, optimization and more, and I heard many theories about it. Let's go through some of them:
Because BizTalk is old. Well, in that case I'd rather say mature, which is a very good point for a product; since the 2004 version the product has grown up to the 2013 R2 version, and in these last 10 years the community has built so much material, so many tools and so much documentation that not many other products can claim the same.
Because BizTalk is big and monolithic. I think this is just a point of view; BizTalk can be very smart. Most of the time I saw architects driving their solution in a monolithic way and, most of the time, the problem was a lack of knowledge about BizTalk Server.
Because BizTalk is complicated. Well, Forrest Gump at this point would say "complicated is as complicated does". During my assessments and mentoring I see so many overcomplicated solutions which could be solved in a very easy way, and there are many reasons for that: sometimes we miss the knowledge, other times we don't like to face the technology and we decide for what I like to call the "chicken way".
Because now we have the cloud. Well, in part I can agree with that but, believe it or not, we also have on-premises: companies still use hardware, companies still integrate on-premises applications and, believe it or not, integrating systems in a productive way to send data into the cloud in an efficient and reliable mode is something very complicated to do and, at the moment, BizTalk is still number one at it.
Because BizTalk costs. The BizTalk license depends on the number of processors we need in order to run our solution and achieve the number of messages per second we need to consume. This is the main dilemma but, in my opinion, it is quite easy to solve, and this is my simple development theory:
The number of messages we are able to achieve is inversely proportional to the number of bad practices we produce in our solution.
Many people ask me this question: Nino, what do you think is the future of BizTalk Server?
I don't like to speak about the future; I saw many frameworks come up and disappear after one or two years.
I like to consider the present, and I think BizTalk is a solid product with tons of features, able to cover and support any integration scenario in a great way.
In my opinion the main problem is how we approach this technology.
Many times companies think of BizTalk as a single product to use to cover every aspect of a solution, and this is deeply wrong.
I like to use many technologies together and combine them in the best way but, most importantly, with each technology solving the specific task it is right for.
In my opinion, when we look at a technology we need to get all the pros and cons, and we must use the pros in the proper way to avoid the cons.
BizTalk can be easily extended, and we can compensate for any cons in a very easy way.
Below are some of my personal best hints, derived from my experience in the field:
If you are not comfortable or sure about BizTalk Server, then call an expert; in one or two days he will be able to show you the right way. This is the most important and best hint: I saw many people blaming BizTalk Server instead of blaming their lack of knowledge.
Use the best naming convention able to drive you in a proper way through your solution. I don't always like to follow the same one, because every solution, to be well organized, needs a different structure; believe me, the naming convention is everything in a BizTalk solution.
Use orchestrations only when you need a strict pattern between the processes; orchestrations are the most expensive resource in BizTalk Server, so if I need to use one, I use it for this specific reason only.
If I need to use an orchestration, then I like to simplify by using code instead of many BizTalk shapes; I like to use external libraries in my orchestrations, as it's simpler than creating tons of shapes and persistence points in the orchestration.
Many times we don't need to use an adapter from an orchestration, which costs resources in the system; for example, many times we need to retrieve data from a database or call another service and we don't need it to be reliable.
Drive your persistence points: we can control them using atomic scopes and .NET code, and I like to have only the persistence point I need to recover my process.
Anything can be extensible and reusable; when I start a new project I normally use my templates, and I like to provide my templates to the customer.
I avoid the MessageBox where I need real-time performance; I like to use two different techniques for that: one is using a Redis cache, the second is RPC.
One of the big features provided by BizTalk Server is the possibility to reuse the artefacts separately, outside the BizTalk engine; in this way I can easily achieve real-time performance in a BizTalk Server process.
Many times we can use a Standard edition instead of an Enterprise edition; the Standard edition has two main limitations: we can't have multiple MessageBoxes and we can't have multiple BizTalk nodes in the same group.
If the downtime is acceptable, I like to use a couple of Standard editions; with a proper configuration and a virtual server environment I'm able to achieve a very good high-availability plan while saving costs.
I always use BAM, and I implement a good notification and logging system. BizTalk Server is the middleware and, believe me, for any issue you have in production people will blame BizTalk; good metrics to manage and troubleshoot any possible issue quickly will make your life great.
Do the performance testing using mock services first and the real external services after; in this way we can measure the real performance of our BizTalk solution. I saw many companies waste a lot of money trying to optimize a BizTalk process instead of the actual external service.
To conclude, when I look at BizTalk Server I don’t see a T-Rex.
BizTalk reminds me more of a beautiful woman, like Jessica Rabbit:
full of qualities but, like any woman, sometimes she plays up; we only need to know how to live together with her 😉
In the last two days I struggled against this error and it was a real nightmare; because the solution is really weird and complicated to find, I decided to keep a note about it on my blog and hopefully help some other people.
One day, without any real specific reason, I was not able to build my solution anymore: hundreds of CS0006 errors. So my first and usual actions were:
- Clean the solution, nothing…
- Restart VS, nothing…
- Clean and restart the solution, nothing…
At that point I started looking on the network and found so many articles, tips and hints: remove all the references and re-add them again, check the build settings in the configuration manager, fix precedences, but nothing worked and, to be honest, with 90 projects in the solution, that was not a feasible option for me.
Two main assumptions drove me to the solution:
- Previously the solution was able to build, and I didn't touch any code.
- I copied the solution to another machine and, to my surprise, it was able to build.
I used Process Explorer to check what Visual Studio does during the compilation, and I noticed that quite an interesting number of temporary files related to the NuGet packages were created.
The length of the generated path was interesting, so I decided to move my project folder to the C: root, and the problem was solved.
The biggest problem, I think, was that I added a new NuGet package to a project and Visual Studio was not able to generate the specific temp files; unfortunately, I didn't receive any exception or warning about that.
So, in case of a CS0006 error, the first test to do is to try copying the solution to a shorter path; hopefully that will fix the issue.
In the last month I've been a speaker at two events, WPC 2016 in Milan and BTUG.be in Belgium; at both events I presented the results of my studies around the holistic approach and my point of view about integration.
WPC 2016 Milan, with 77 speakers and 400+ attendees, is the first event in Italy about IT, technology and innovation: great content and very well prepared speakers.
BTUG.be is a technical event focused on integration and organized by the BizTalk User Group in Belgium; the guys invited me two months ago, and I was very happy to accept because I was sure I would find an audience of very strong and expert technical people.
At BTUG.be I enjoyed all the sessions: Microsoft presented a session around integration patterns and SOLID concepts; Pieter presented a very interesting and very detailed session, What's new in BizTalk Server 2016, about the new features; and Glenn, speaking about Azure Functions, presented a very interesting point of view and a comparison between Azure Functions and the other stacks like Logic Apps and WebJobs.
At both events I presented my session, my point of view about integration and what a holistic approach means; I received impressively great feedback at both events, and I'm very happy about the spontaneous feedback I'm receiving via LinkedIn and email.
Looking at the feedback, one of the sentences I like most is:
“finally someone that makes sense regarding integration, I know exactly what you mean with your holistic approach.
I have tried for many years to explain to people that what you’re saying, and you do it too.
So you have shown me the way”
This was exactly the scope of my session: to give people something to think about, a new view of how to use technologies and how to combine them together.
I think that, in a moment full of marketing messages, with thousands of technologies and options, this is one of the most important aspects of interest.
Sometimes we don't realize the potential of a technology because we are focused on the messages we receive from the network; other times we don't see the technology from a different point of view.
When I approach a new technology I'm like a child with a new toy: I normally don't care how the network defines or categorizes it, I just get the main marketing message and start playing with the technology.
I like to see every technology from a different point of view, every time; this is the best approach to use to better understand its potential.
I also use GrabCaster to explain these concepts because it contains all of my studies; GrabCaster is my personal laboratory, a space which contains all of my studies and ideas. I am starting to see other companies implementing patterns and concepts already implemented in GrabCaster a long time ago, and this is a great pleasure for me.
People are following and looking into the GrabCaster code to get new ideas and patterns, like open patterns, layering abstraction, dynamic deployment, open mediation, how to achieve real-time performance in BizTalk Server and more. I still need to create more videos and tutorials, and I will, family and sport activities permitting.
At the moment GrabCaster contains a lot of great features, but this is probably 10% of what I have in mind; I will keep implementing on it, and if you would like to collaborate, please do.
I will improve this session over time, and I will create more scenarios using, combining and extending what I consider the best technologies to use together at the moment.