Two questions about your graphs:
Firstly: why are you plotting rates of change of CO2 concentration and temperature?
Differentiating a signal exaggerates its noise, especially when there are known oscillations. El Niño and La Niña affect the distribution of temperature, so you would expect to see a quasiperiodic fluctuation in the rate of change of temperature, with a roughly biennial frequency.
If you want to remove noise, you integrate the signal, not differentiate it.
Secondly: why are you plotting only a ten-year trend? That is shorter than a single sunspot cycle, which also affects the weather.
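A quick numeric sketch of the first point (my own illustration with synthetic data, not from any real temperature record): a weak trend buried in noise vanishes when you difference the series, because differencing roughly doubles the noise variance.

```python
# Sketch: differencing a noisy series amplifies noise; cumulating smooths it.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1000)
signal = 0.001 * t                         # weak linear trend (the "climate")
noise = rng.normal(0.0, 1.0, t.size)       # measurement noise (the "weather")
series = signal + noise

diff = np.diff(series)                     # "rate of change" of the series
cusum = np.cumsum(series - series.mean())  # integrated deviations

# The differenced series is dominated by noise: its standard deviation is
# about sqrt(2) times that of the raw noise, swamping the 0.001/step trend.
# The cusum, by contrast, varies smoothly and makes the trend visible.
print(np.std(diff), np.std(noise))
```

Differencing independent noise of variance 1 gives variance 2, which is why the printed standard deviation of `diff` comes out near 1.41 rather than 1.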
I'll give an example of what I mean by integration:
Here is the average temperature in central England, as it is the longest running set of direct temperature measurements.
[qimg]http://www.internationalskeptics.com/forums/imagehosting/1449447ad7af021290.png[/qimg]
It looks quite noisy, but you might notice that there are fewer low temperatures towards the end of the twentieth century.
However there is a very nice technique in statistical process control, called the cusum (cumulative sum). Here is a somewhat simplified discussion:
You can take the long-term average (mean) of the data, and the difference of each point from this LTA; then you add all these differences up and plot how the running total changes with time. If a region is flat, the process is running at the LTA; if climbing, it is running above; if falling, it is running below.
Over the entire dataset, the final cusum value will be zero, as the total sum above the mean will equal the total sum below. If a parameter is increasing, then at the beginning the data will be below the LTA, so the cusum will fall; then it will pass through the LTA and be roughly flat; and then it will be running above the LTA, so the cusum will rise. The converse holds for a decreasing process. If the cusum keeps crossing the zero line, there is no trend.
Changes in gradient indicate a change in the process mean.
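The steps above can be sketched in a few lines (an assumed implementation following the description, with made-up numbers for the yearly means):

```python
# Minimal cusum: accumulate each point's deviation from the long-term average.
def cusum(data):
    lta = sum(data) / len(data)   # long-term average (LTA)
    total, out = 0.0, []
    for x in data:
        total += x - lta          # deviation from the LTA
        out.append(total)         # running total of deviations
    return out

# Made-up yearly mean temperatures, °C, with a rise in the later years:
temps = [9.1, 8.7, 9.0, 9.4, 9.8, 10.1]
trace = cusum(temps)
# The trace falls while the data run below the LTA, then rises once they run
# above it, and ends at zero by construction.
```

A change of slope in `trace` marks the point where the process mean changed, which is exactly what the gradient changes in the plots below show.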
Here is the cusum for the same data as before:
[qimg]http://www.internationalskeptics.com/forums/imagehosting/1449447ad7aa11c333.png[/qimg]
Now, there is no reason to choose the LTA as the "target" value; it is just that with the LTA the cusum always adds up to zero.
If you choose a different target value, you can see whether the process was ever running at that particular value.
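The same idea in code (my own illustration, not the poster's): cumulate deviations from a chosen target instead of the mean. A flat stretch of the trace then means the process was running at that target during that period.

```python
# Cusum against an arbitrary target value rather than the long-term average.
def cusum_about(data, target):
    total, out = 0.0, []
    for x in data:
        total += x - target       # deviation from the chosen target
        out.append(total)
    return out

# If the data really run at the target, the trace hovers near zero:
steady = [9.5, 9.4, 9.6, 9.5, 9.5]   # made-up values, °C
trace = cusum_about(steady, 9.5)
# A target above the true level would make the trace fall steadily instead.
```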
Here is an example:
[qimg]http://www.internationalskeptics.com/forums/imagehosting/1449447ad7aa1b774e.png[/qimg]
You can see that the temperatures were running about 0.092°C below the historic LTA for most of the 19th century, before increasing around 1900.
You can also see that, a few years after the record starts, there was a cooler period that ended around 1700. There is a slight increase in gradient sometime in the 20th century, which is hard to see here. However, it is clearer on an expanded scale, and with a different target:
[qimg]http://www.internationalskeptics.com/forums/imagehosting/1449447ad80ec930f4.png[/qimg]
There was a change between 1960 and 1970, and the average temperature is now running about 0.8°C above the historic LTA, and seems to be increasing again...
The point is that, although the data look noisy, integrating them can still reveal trends. Differentiating the raw data only highlights the noise.
ETA: Of course, an oscillating process will also repeatedly cross the zero line.