Announcing the Availability of the BizTalk ESB Toolkit 2.0

As pre-announced during several BizTalk sessions at TechEd North America last month, the BizTalk ESB Toolkit 2.0 (formerly known as ESB Guidance 2.0) has been released to the web today. It is available on the new ESB page in the BizTalk Developer Center on MSDN.

This toolkit extends BizTalk Server 2009 capabilities to support a loosely coupled and dynamic messaging architecture. It functions as middleware that provides tools for rapid mediation between services and their consumers. Enabling maximum flexibility at run time, the BizTalk ESB Toolkit 2.0 simplifies composition of service endpoints without “hard-wiring” them and also provides management of service interactions at enterprise scale.

Why did we change the name?

The BizTalk ESB Toolkit 2.0 has been renamed from “ESB Guidance 2.0” to reflect the fact that it is now provided as a BizTalk Server 2009 value-add with a better support model. This will hopefully help customers develop mature ESB implementations that are ready for enterprise-wide deployments.

How does this change support and community engagement?

The BizTalk ESB Toolkit 2.0 Connect site has been created to log bugs with the BizTalk ESB Toolkit Team and to provide updates to additional tools over time. Once you log a bug, someone from the BizTalk ESB Toolkit team will respond to you within five days with an acknowledgment and status. Any future updates or tools will be provided through the Microsoft Download Center.

In addition, an ESB Toolkit Forum is provided on MSDN. These forums target online communities, in particular BizTalk MVPs and other BizTalk community lists. Best-effort assistance will be offered through a closely managed forum; that is, any fixes and responses to questions in the forums are best effort. We will continue to leverage the community to provide peer assistance, while retaining the ability to issue fixes if necessary.

What’s new with the ESB Toolkit 2.0?

The BizTalk ESB Toolkit 2.0 provides both architectural enhancements and new capabilities over the previous ESB Guidance 1.0. For more information, see the SOA and Web Services section of the New Features in BizTalk 2009 Web page (the rename will propagate soon).

What does the BizTalk ESB Toolkit 2.0 provide?

The BizTalk ESB Toolkit 2.0 provides key building blocks that are required for implementing a comprehensive service-oriented infrastructure (SOI) including:

  • Endpoint run-time discovery and virtualization.
    The service consumer does not need to be aware of the service provider location and endpoint details; a new or modified service provider can be added to the ESB, without interruptions to the service consumer.
  • Loosely coupled service composition.
    The service provider and service consumer do not need to be aware of service interaction style.
  • Dynamic message transformation and translation.
    The mapping definition between distinct message structures and semantics is resolved at run time.
  • Dynamic routing.
    Run-time content-based, itinerary-based, or context-based message routing.
  • Centralized exception management.
    Exception management framework, services, and infrastructure elements that make it possible to create, repair, resubmit, and compensate fault messages that service consumers or BizTalk components submit.
  • Quality of service.
    An asynchronous publish/subscribe engine resolves different levels of service availability and provides high availability, scalability, and message traceability for ESB implementations.
  • Protocol transformation.
    Providing the ability for the service provider and service consumer to interact via different protocols, including WS-* standards for Web services. For example, a service consumer can send an HTTP Web service request that results in a message being sent via the BizTalk SAP adapter.
  • Extensibility.
    Providing multiple extensibility points to extend functionality for endpoint discovery, message routing, and additional BizTalk Server adapters for run time and design time.

How to get started with the BizTalk ESB Toolkit 2.0?

Using PowerShell in BizTalk Post-Processing Scripts

[Source: http://geekswithblogs.net/EltonStoneman]

Often in BizTalk deployments you need to do additional work after installation. Typically your full install process may need to:

  • Install BizTalk artifact assemblies to the GAC
  • Install application dependencies to the GAC
  • Register an application source name in the registry, for logging to the Event Log
  • Create FILE send or receive locations on the local filesystem
  • Add application store configuration settings to Enterprise Single Sign-On (SSO)
  • Add log4net configuration settings to BTSNTSvc.exe.config

You can achieve this with a single BizTalk installer by configuring resources and post-processing scripts, and exporting an MSI from the application. Various scripting languages are supported in BizTalk installations (batch files, VBScript, etc.), except the most logical one: PowerShell, which gives first-class support for the filesystem, the registry, XML files, and .NET objects. You can still use PowerShell by including the scripts as resources and using a batch file as the post-processing script, which acts as a harness to call the PowerShell scripts.

This walkthrough addresses all the points above. The completed BizTalk application is on MSDN Code Gallery here: BizTalk PowerShell Deployment Sample – import and install the MSI to deploy with the PowerShell script, or browse the ZIP file to see the scripts and resources.

1. Install BizTalk artifact assemblies to the GAC

This is straightforward: set the resource option “Add to the global assembly cache on MSI file install” to true. This happens by default if you add a BizTalk Assembly resource in the Administration Console.

Using the command line, though, this is not the default, so you need to set -Options:GacOnInstall explicitly in BTSTask:

btstask AddResource
    -ApplicationName:PowerShellSample
    -Type:BizTalkAssembly
    -Options:GacOnInstall
    -Source:PowerShellSample.Schemas.dll
    -Destination:%BTAD_InstallDir%\PowerShellSample.Schemas.dll

2. Install application dependencies to the GAC

As in step 1, except the resource type is System.BizTalk:Assembly (in BTSTask you can omit “System.BizTalk”). The command requires the same flag to add the assembly to the GAC on install:

btstask AddResource
    -ApplicationName:PowerShellSample
    -Type:Assembly
    -Options:GacOnInstall
    -Overwrite
    -Source:.\Dependencies\SSOConfig.dll
    -Destination:%BTAD_InstallDir%\Dependencies\SSOConfig.dll

In this case, I’m installing the SSOConfig assembly (from SSO Config Tool) which provides static .NET classes for accessing the SSO application configuration store. The Overwrite flag is set in case the resource already exists in another application.

3. Register an application source name in the registry, for logging to the Event Log

To log to the Application event log with your own source name, you need to add a registry key with the app name, and the name of the handler:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Eventlog\Application\PowerShell.Sample

In PowerShell, this is done using the New-Item cmdlet to create the key, and New-ItemProperty to set the key value:

New-Item -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\Eventlog\Application\PowerShellSample' -Force

New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\Eventlog\Application\PowerShellSample' -Name 'EventMessageFile' -PropertyType ExpandString -Value 'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\EventLogMessages.dll' -Force

(HKLM: is a PowerShell drive mapped to HKEY_LOCAL_MACHINE, and the -Force flag overwrites existing values).

To execute the PowerShell script on install, we need a batch file that BizTalk can run as a post-processing script. The batch file is very simple, just separating install and uninstall logic into individual PowerShell scripts and redirecting script output to a log file:

cd "%BTAD_InstallDir%\Deployment"

if "%BTAD_InstallMode%" == "Install" ( powershell ".\PowerShellSample.Install.ps1" >> PowerShellSample.Install.ps1.log )

if "%BTAD_InstallMode%" == "Uninstall" ( powershell ".\PowerShellSample.Uninstall.ps1" >> PowerShellSample.Uninstall.ps1.log )

Both the CMD and PS1 files need to be added as resources to the BizTalk application. The PS1 files are of type BizTalk:File, and the CMD harness is of type BizTalk:PostProcessingScript:

btstask AddResource
    -ApplicationName:PowerShellSample
    -Type:File
    -Source:.\Deployment\PowerShellSample.Install.ps1
    -Destination:%BTAD_InstallDir%\Deployment\PowerShellSample.Install.ps1

btstask AddResource
    -ApplicationName:PowerShellSample
    -Type:PostProcessingScript
    -Source:.\Deployment\PowerShellSample.PostProcessing.cmd
    -Destination:%BTAD_InstallDir%\Deployment\PowerShellSample.PostProcessing.cmd

4. Create FILE send or receive locations on the local filesystem

If you need to create static file locations, the same New-Item cmdlet is used with the filesystem provider. Specify the full path for the directory and any intermediate directories will be created if they don’t exist. Use the -Force flag to suppress warnings if the directory already exists:

New-Item -Path 'c:\receiveLocations\x\y\z' -ItemType Directory -Force

Note that the resources in the BizTalk application are copies rather than references, so if you modify your PS1 files, you’ll need to update the resource (in the Administration Console, select the resource and use Modify… Refresh, or re-run the BTSTask command).

5. Add settings to Enterprise Single Sign-On (SSO)

If you’re using SSO to store group-wide application config, you can create or export an XML file of the settings using SSO Config Tool. We add the .ssoconfig file as a File resource to the application, then in the install script use PowerShell to call a .NET method to import the settings using the SSOConfig.SSOApplication class. The SSOConfig assembly is a resource which has already been deployed to the GAC by this point in the installation:

[Reflection.Assembly]::Load('SSOConfig, Version=1.1.0.0, Culture=neutral, PublicKeyToken=656a499478affdaf')

$configPath = [IO.Path]::Combine($env:BTAD_InstallDir, 'Deployment\PowerShellSample.ssoconfig')

$app = [SSOConfig.SSOApplication]::LoadFromXml($configPath)

$app.SaveToSSO()

Note that the PowerShell script has access to all the environment variables set by BizTalk during the install; they are accessed by prefixing the variable name with $env:, as we do here to get the installation directory from the installer ($env:BTAD_InstallDir).
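Dumping all the installer-supplied variables to the script's log can be a useful debugging aid. This is a hypothetical addition, not part of the sample; outside the installer no BTAD_* variables exist, so one is faked here purely for illustration:

```powershell
# Fake one installer variable so the sketch produces output outside an MSI run
$env:BTAD_InstallMode = 'Install'

# List every BTAD_* environment variable as name=value lines
Get-ChildItem env: |
    Where-Object { $_.Name -like 'BTAD_*' } |
    ForEach-Object { '{0}={1}' -f $_.Name, $_.Value }
```

Redirecting those lines into the install log (as the batch harness already does for script output) records exactly what the installer passed in.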

6. Add settings to BTSNTSvc.exe.config

Modifying XML is straightforward in PowerShell. We want to configure an Event Log appender in the BizTalk service config file by adding the following XML:

<configSections>
  <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=1b44e1d426115821" />
</configSections>

<log4net>
  <appender name="Sixeyed.CacheAdapter.EventLogAppender" type="log4net.Appender.EventLogAppender, log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=1b44e1d426115821">
    <param name="LogName" value="Application"/>
    <param name="ApplicationName" value="Sixeyed.CacheAdapter"/>
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date [%thread] %logger %level - %message%newline" />
    </layout>
  </appender>
  <logger name="Sixeyed.CacheAdapter.Log">
    <level value="WARN" />
    <appender-ref ref="Sixeyed.CacheAdapter.EventLogAppender" />
  </logger>
</log4net>

The Get-ItemProperty cmdlet can read the BizTalk install path from the registry, then Get-Content reads the file – casting it to XML for subsequent processing:

$installPath = Get-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\BizTalk Server\3.0' -Name 'InstallPath'

$btsConfigPath = [IO.Path]::Combine($installPath.InstallPath, 'BTSNTSvc.exe.config')

$xml = [xml] (Get-Content $btsConfigPath)

On a fresh install, the config file is quite bare and doesn't include a <configSections> element, so in that case we need to add both the <configSections> and <log4net> nodes. We can't guarantee that other solutions haven't already modified the config file, though, so <configSections> may exist, and <log4net> may also exist; in that case, we just need to add our specific appender and logger values (log4net allows you to define multiples of these in config, and we specify names that we can expect to be unique).

To achieve this, the script checks for each element first, creates it if it doesn’t exist, then adds the specific settings:

$configSections = $xml.SelectSingleNode('configuration/configSections')
if ($configSections -eq $null)
{
    $configSections = $xml.CreateElement('configSections')
    $firstChild = $xml.configuration.get_FirstChild()
    $xml.configuration.InsertBefore($configSections, $firstChild)
}

$log4netSection = $configSections.SelectSingleNode('section[@name="log4net"]')
if ($log4netSection -eq $null)
{
    $log4netSection = $xml.CreateElement('section')
    $log4netSection.SetAttribute('name', 'log4net')
    $log4netSection.SetAttribute('type', 'log4net.Config.Log4NetConfigurationSectionHandler, log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=1b44e1d426115821')
    $configSections.AppendChild($log4netSection)
}

Finally the updates are saved over the original file:

$xml.Save($btsConfigPath)
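The snippets above cover <configSections> and its <section> entry; the <log4net> node itself follows the same check-then-create pattern. Here is a standalone sketch of that pattern against an in-memory document, so it can be tried outside the installer (the appender name matches the fragment shown earlier; the rest of the appender's attributes are omitted for brevity):

```powershell
# In-memory stand-in for BTSNTSvc.exe.config
$xml = [xml] '<configuration><configSections /></configuration>'

# Check-then-create: only add <log4net> if it isn't already present
$log4net = $xml.SelectSingleNode('configuration/log4net')
if ($log4net -eq $null)
{
    $log4net = $xml.CreateElement('log4net')
    $xml.DocumentElement.AppendChild($log4net) | Out-Null
}

# Add the uniquely named appender under it
$appender = $xml.CreateElement('appender')
$appender.SetAttribute('name', 'Sixeyed.CacheAdapter.EventLogAppender')
$log4net.AppendChild($appender) | Out-Null

$xml.OuterXml
```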

Limitations

The main limitation with any post-processing script is that the target environment selected for the install is not available. If you have multiple bindings files, the environment selected at import time is only alive for the duration of the MSI import; the install has no reference to it, and there is no record made in the management database (none that I can see, anyway; please correct me if there is). This means you can't switch your script based on environment (for example, to use different SSO config settings for System Test and Production). If that's a serious restriction, you may prefer to create different MSIs per environment in your build process, each containing the correct bindings file and scripts.

Specific to this approach, you need to have PowerShell installed on all the target machines, and configured to allow script execution (by default, scripts are not permitted to execute, as a security measure). Hopefully this is becoming the norm. Security also needs to be considered: the sample app writes to the registry and to SSO, so the installing context needs to have explicit permissions. The BizTalk installer runs under a separate security context from the installing user (by a trial-and-error process, this is NT AUTHORITY\ANONYMOUS LOGON in my Server 2003 VM), so if you're amending SSO you'll need to set your SSO Administrators group membership accordingly.

Benefits

The completed PowerShell scripts should be straightforward to read and maintain. All the post-installation requirements are implemented using a single technology, and many of the functions are reusable and could easily be parameterised and moved to a central script. The script is easy to test outside of the installer runtime, either manually using a batch file as a test harness (which sets up the relevant environment variables and then calls the post-processing file), or worked into an automated unit test.
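A minimal harness of the kind described might look like this (the BTAD_* variable names are the real ones used earlier in the walkthrough; the directory and script path are illustrative):

```powershell
# Fake the environment variables the BizTalk installer would normally supply
$env:BTAD_InstallDir  = 'C:\Temp\PowerShellSample'
$env:BTAD_InstallMode = 'Install'

# Invoke the install script directly, outside the MSI (path is illustrative):
# & '.\Deployment\PowerShellSample.Install.ps1'

"Harness ready: mode=$($env:BTAD_InstallMode) dir=$($env:BTAD_InstallDir)"
```

Because the scripts only read their context from $env:, the same files run identically under the installer and under the harness.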

The approach is not limited to BizTalk installations, so similar tasks for .NET deployments which are currently done with custom assemblies or Wix script can be isolated in the same way. With BizTalk and .NET installs using the same technology, you’ll build up a library of high-quality, reusable PowerShell scripts.

I also like having the scripts deployed as part of the install, so in combination with the log files, you can see exactly what’s been done to your environment and modify if necessary.

Extensions

With native cmdlets and community scripts, together with WMI, XML and .NET code, you can achieve any desired functionality with PowerShell scripts, and have them rapidly developed and tested. So you can easily add code to update version numbers in config files, remove your assemblies from the GAC on uninstall, access performance counters etc. And PowerShell scripts are just plain text so you can extract them into a T4 template and generate different scripts for different environments in your build process.

What Is an Enterprise Service Bus?

An Enterprise Service Bus (ESB) is an architectural pattern and a key enabler in implementing the infrastructure for a service-oriented architecture (SOA). Real-world experience has demonstrated that an ESB is only one of many components required to build a comprehensive service-oriented infrastructure (SOI). The term “ESB” has various interpretations in the market, which have evolved over time; however, the basic challenge it addresses is the same.

Disabling Itinerary Encryption in the ESB Toolkit 2.0

Those of you who worked with the CTP of the ESB Toolkit 2.0 will notice a new feature in the final release, namely “Itinerary Encryption”. The Itinerary Designer now allows you to use a certificate to encrypt your itineraries before you export them out of Visual Studio. This is a key new piece of functionality, since your itineraries may contain sensitive configuration information or sensitive processes that you do not want to leave exposed as plain-text XML.


In the properties window for the Itinerary Designer you can see a new property called “Encryption Certificate”. You can use this property to select a certificate from a certificate store.



Now, as important as this option is, what I'm going to write about is how to disable it. On my dev machine I did not have any valid certificates installed, so I wasn't able to select one to use for encryption. This prevented me from validating or exporting my itinerary, since the validation tool kept throwing an error. Since this was only a dev machine, I didn't care about the security of these itineraries, so I really wanted to disable this feature so that I could keep working. Fortunately, there is a simple and easy way to do this.


If you have installed the ESB Toolkit to the default location, you should be able to find a file called “ruleset.config” in the “C:\Program Files\Microsoft BizTalk ESB Toolkit 2.0\Tools\Itinerary Designer” folder. This file contains a list of validation rules that the Itinerary Designer uses when validating or exporting your itinerary. If you open this file in Visual Studio, you will find a node called <property name="EncryptionCertificate">. Inside this node, you will see two rules that define how the validation of certificates should be handled. The first rule is the one the designer uses by default, and it says that an error should be thrown if you do not have a certificate assigned. I commented out this rule, and when I ran the validation routine again, I only received a warning message about the lack of a cert. I was then able to export my itinerary. Here’s what the modified file looked like on my system.


<property name="EncryptionCertificate">
  <!--<validator type="Microsoft.Practices.Modeling.Validation.X509CertificateContainerValidator, Microsoft.Practices.Modeling.Validation"
    messageTemplate="A X509 Certificate is required in the model property '{0}' to encrypt any sensitive property in the designer."
    name="EncryptingCertificate validator"/>-->
  <!-- Warning message when not enforcing encryption -->
  <validator type="Microsoft.Practices.Modeling.Validation.X509CertificateContainerValidator, Microsoft.Practices.Modeling.Validation"
    messageTemplate="Some data may not be secured because no X509 Certificate was specified in the model property '{0}'."
    tag="Warning"
    name="EncryptingCertificate (warning) validator"/>
</property>


Cheers and keep on BizTalking


Peter

Improving the performance of web services in Silverlight 3 Beta

Cross-posted from the Silverlight Web Services Team Blog.

Silverlight 3 Beta introduces a new way to improve the performance of web services. You have all probably used the Silverlight-enabled WCF Service item template in Visual Studio to create a WCF web service, and then used the Add Service Reference command in your Silverlight application project to access the web service. In SL3, the item template has undergone a small change that turns on the new binary message encoder, which significantly improves the performance of the WCF services you build. Note that this is the same binary encoder that has been available in .NET since the release of WCF, so all WCF developers will find the object model very familiar.

The best part is that this is done entirely in the service configuration file (Web.config) and does not affect the way you use the service. (Check out this post for a brief description of exactly what the change is.)
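For reference, the change in the generated Web.config amounts to a custom binding that pairs the binary message encoder with the HTTP transport, along these lines (the binding name here is illustrative, not the template's exact output):

```xml
<system.serviceModel>
  <bindings>
    <customBinding>
      <!-- Binary encoding over HTTP: understood by WCF clients, including Silverlight 3 -->
      <binding name="binaryHttpBinding">
        <binaryMessageEncoding />
        <httpTransport />
      </binding>
    </customBinding>
  </bindings>
</system.serviceModel>
```

The service endpoint then opts in by specifying binding="customBinding" together with bindingConfiguration="binaryHttpBinding"; the service contract and code are untouched.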

I wanted to share some data that shows exactly how noticeable this performance improvement is, and perhaps convince some of you to consider migrating your apps from SL2 to SL3.

When Silverlight applications use web services, XML-based messages (in the SOAP format) are exchanged. In SL2, those messages were always encoded as plain text when being transmitted; you could open an HTTP traffic logger and read the messages. However, plain text is far from a compact encoding for sending across the wire, and far from fast to decode on the server side. With the binary encoder, the messages are encoded using a WCF binary encoding, which provides two main advantages: increased server throughput and decreased message size.

Increased server throughput

Let’s examine the following graph (hat tip to Greg Leake of StockTrader fame for collecting this data). Here is the scenario we measure: the client sends a payload, and the server receives it and sends it back to the client. Many clients are used to load the service up to its peak throughput. We run the test once using the text-based encoding and once using the new binary encoding, and compare the peak throughput at the server. We do this for two message sizes: in the smaller case the payload is an array with 20 objects, and in the larger case it is an array with 100 objects.

Some more details for the curious: the service is configured to ensure no throttling occurs, and a new instance of the service is created for every client call (known as PerCall instancing). There are ten physical clients driving load, each running many threads that hit the service in a tight loop (with a small 0.1-second think time between requests), using a shared channel to reduce client load. The graph measures peak throughput on the service at 100% CPU saturation. Note that in this test we did not use Silverlight clients but regular .NET clients; since we are measuring server throughput, what the clients are is not significant.

When sending the smaller message we see a 24% increase in server throughput, and with the larger message size we see a 71% increase in server throughput. As the message complexity continues to grow, we should see even more significant gains from using the binary encoder.

What does that mean to you? If you run a service that is being used by Silverlight clients and you exchange non-trivial messages, you can support significantly more clients if the clients use SL3’s binary encoding. As usage of your service increases, that could mean being able to save on buying and deploying extra servers.

Decreased message size

Another feature of the binary encoder is that since messages sent on the wire are no longer plain-text, you will see a reduction in their average size. Let’s clarify this point: the main reason you would use the binary encoding is to increase the service throughput, as discussed in the previous section. The decrease in message size is a nice side-effect, but let’s face it: you can accomplish the same effect by turning on compression on the HTTP level.

This test was far less comprehensive than the previous one and we did it ad-hoc on my co-worker’s office machine. We took various objects inside a Silverlight control, and turned them into the same kind of SOAP messages that get sent to the service. We did this using the plain-text encoding and using binary encoding and then we compared the size of the messages in bytes. Here are our results:

The takeaway here is that the reduction of message size depends on the nature of the payload: sending large instances of system types (for example a long String) will result in a modest reduction, but the largest gains occur when complex object graphs are being encoded (for example objects with many members, or arrays).

What does this mean to you? If you run a web service and you pay your ISP for the traffic your service generates, using binary encoding will reduce the size of messages on the wire, and hopefully lower your bandwidth bills as traffic to your service increases.

Conclusion

We are confident that binary encoding is the right choice for most backend WCF service scenarios: you should never see a regression over text encoding when it comes to server throughput or message size; hopefully you will see performance gains in most cases. This is why the binary encoder is the new default in the Silverlight-enabled WCF Service item template in Visual Studio.

An important note: binary encoding is only supported by WCF services and clients, and so it is not the right choice if you aren’t using WCF end-to-end. If your service needs to be accessed by non-WCF clients, using binary encoding may not be possible. The binary AMF encoding used by Adobe Flex is similarly restricted to services that support it.

June 7th Links: ASP.NET, AJAX, ASP.NET MVC, Visual Studio

Here is the latest in my link-listing series.  Also check out my ASP.NET Tips, Tricks and Tutorials page and Silverlight Tutorials page for links to popular articles I’ve done myself in the past.

You can also now follow me on twitter (@scottgu) where I also post links and small posts.

ASP.NET

  • Implementing Incremental Navigation with ASP.NET: A nice article from Andrew Wrigley that describes how to use ASP.NET’s Site Navigation system to create a navigation user interface.

  • Syndicating and Consuming RSS Feeds in ASP.NET: A nice article from Scott Mitchell that describes how to work with RSS using ASP.NET 3.5.

  • Using Expression Builders in ASP.NET: Scott Mitchell has another good article that describes a little-known extensibility feature in ASP.NET. 

  • Apply ASP.NET Authentication/Authorization to Static Content: Scott Mitchell has another great article that describes how to apply ASP.NET’s Security features to static content (html/images/etc) using IIS7.

  • GridView Confirmation Box using jQuery: Mohammed Azam has a nice post that describes how to implement modal confirmation UI using jQuery.  This is particularly useful for scenarios like saving or deleting data.

AJAX

  • Building Interactive UI with AJAX – A look at JSON Serialization: Scott Mitchell has a nice article that explores the JSON serialization format used by ASP.NET AJAX when calling web-services and server end-points from client-side script.

  • Building Interactive UI with AJAX: Retrieving Server-Side Data using Web Services: Another good article by Scott Mitchell that describes how to call web-services to retrieve data from client-side script.

  • Periodically Updating the Screen and Web Page Title with ASP.NET AJAX: Scott Mitchell demonstrates how to use ASP.NET AJAX to dynamically update client screens with new data.

  • ASP.NET 4.0 AJAX – Client Templates: Damien White has a great post that describes the new client templating support in ASP.NET AJAX.  This provides an easy and powerful way to dynamically create rich HTML UI on the client.

  • ASP.NET 4.0 AJAX – Data Binding: Damien White continues his great ASP.NET AJAX series with this article that describes the new client-side data binding features in the new version of ASP.NET AJAX. 

ASP.NET MVC

  • Tip: Turn on compile-time View Checking: A nice post from Adrian Grigore who demonstrates how to easily enable compile-time checking of your ASP.NET MVC view files.

  • xVal: A validation framework for ASP.NET MVC: Steve Sanderson writes about his cool xVal validation framework for ASP.NET MVC.  This enables you to perform both client-side and server-side validation of model objects.  I also highly recommend checking out his Pro ASP.NET MVC Framework book – it is absolutely fantastic. 

  • DataAnnotations and ASP.NET MVC: Brad Wilson (a dev on the ASP.NET MVC team) has a nice post that describes how to use DataAnnotations to annotate model objects, and then use a model binder to automatically validate them when accepting form posted input.  DataAnnotation support will be built-in with the next version of ASP.NET MVC.

  • More ASP.NET MVC Best Practices: Maarten Balliauw has a nice blog post that summarizes some nice ASP.NET MVC best practices.  Also check out his ASP.NET MVC 1.0 Quickly book.

Visual Studio

Hope this helps,

Scott

Who moved my cheese?

Hi all

Just a quick book review, since I have just read the “Who Moved My Cheese?” book.

It only takes about an hour to read, and it is the story of two mice and two people who are searching for cheese, where the cheese is a metaphor for something you really want to have. At some point, they run out of cheese, and the four characters take very different approaches to the changes in their environment. The point of the story is, naturally, that you can most likely identify yourself with one of the characters, and after reading the book, perhaps you have learnt something about how you react to changes and how you should react, and hopefully improve your life.

An easy read with lots of good points. Read it!



eliasen