BLOGS
20.02.2017 14:01 One-way ticket from Matlab to R, and back! (BONUS BALTICAPP - Race Against Eutrophication Blog) Matti Sihvonen

Hi everybody!

I’ve decided to write this somewhat-monthly blog post in a more technical fashion than my previous posts, which were written in very general terms. The methods I’ve used so far in writing my first two articles are non-linear weighted OLS estimation and dynamic programming, and I’ve used Matlab for both. Matlab is a great program for many things, particularly optimization, as far as my experience goes. However, I’ve started to question the reliability of the estimation results that Matlab provides. In addition, I haven’t been able to get all the statistics that I would need for a proper analysis: the basic standard errors, t-values and p-values for the estimates. I also can’t reproduce the goodness-of-fit statistics by basic calculations myself. I noticed that the 95 % confidence intervals that Matlab gives for the estimated parameters are very wide; they even extend beyond the range where the function is defined, which I found very suspicious. Thus, I turned to R, a program very widely used for statistical analysis. I learned to do the weighted non-linear estimation in R, and I finally got all the statistics that I needed. In addition, I understand the statistics it provides and there’s nothing mysterious going on, which I find very important where actual research work is concerned: one has to know where all the numbers come from.
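The statistics mentioned above all fall out of the parameter covariance matrix of a weighted non-linear least-squares fit. As a purely illustrative sketch (the post itself uses R and Matlab; the saturating model, data and parameter names here are invented), this is how standard errors, t-values and p-values can be computed by hand from such a fit in Python:

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

# Hypothetical saturating response curve: y = a * x / (b + x)
def model(x, a, b):
    return a * x / (b + x)

rng = np.random.default_rng(42)
x = np.linspace(1.0, 100.0, 40)
true_a, true_b = 120.0, 15.0
noise_sd = 1.0 + 0.05 * x                  # heteroscedastic noise -> weights
y = model(x, true_a, true_b) + rng.normal(0.0, noise_sd)

# Weighted fit: sigma holds per-point standard deviations,
# i.e. weights proportional to 1/sigma^2.
popt, pcov = curve_fit(model, x, y, p0=[100.0, 10.0],
                       sigma=noise_sd, absolute_sigma=True)

# Standard errors, t-values and p-values derived directly from the
# covariance matrix, so nothing mysterious is going on.
se = np.sqrt(np.diag(pcov))
tvals = popt / se
dof = len(x) - len(popt)
pvals = 2.0 * stats.t.sf(np.abs(tvals), dof)

for name, est, s, t, p in zip(["a", "b"], popt, se, tvals, pvals):
    print(f"{name}: estimate={est:.3f}  SE={s:.3f}  t={t:.2f}  p={p:.4f}")
```

The same quantities are what R's `summary()` of an `nls` fit reports, which is why they can be checked by basic calculation.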

Needless to say, I had to reconsider some models and results. However, although I found R more suitable for statistical analysis than Matlab, the issue with non-linear estimation in R is the starting values for the estimates. It seems that these have to be very close to the final values for the program to be able to do the fitting. So how do you get those starting values? I used Matlab for that. Matlab’s curve fitting toolbox, although I don’t trust the statistics it provides, is very useful, as it illustrates the curve and the residuals right away. It is also very successful in fitting a curve or surface even with the default starting values, once the iteration count is set high enough. This might be related to Matlab’s good optimization abilities, as it is a matrix-based program, and estimation is optimization, after all. Thus, I used Matlab for the initial examination of a particular functional form and to get the starting values. I then inserted the functional form and the starting values into R to get the final parameter estimates and the associated statistics. Finally, I used Matlab again for drawing illustrative simulation figures and for the economic optimization. I must say that for those purposes Matlab is superior to R, at least given my very limited programming skills. All the figures in my work are drawn with Matlab, with the exception of one schematic diagram.
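The sensitivity to starting values described above is typical of iterative non-linear least squares. A minimal sketch of the effect (again in Python rather than R, and with an invented exponential-decay model, purely for illustration): a reasonable starting guess converges to the true parameters, while a poor one can fail outright or stall at a bad fit.

```python
import numpy as np
from scipy.optimize import curve_fit

# Exponential decay, a classic case where non-linear least squares is
# sensitive to the starting values of the parameters.
def model(x, a, b, c):
    return a * np.exp(-b * x) + c

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = model(x, 5.0, 0.8, 2.0) + rng.normal(0.0, 0.1, x.size)

# Reasonable starting values (e.g. eyeballed from a quick plot, much as
# one might do with Matlab's curve fitting toolbox) converge nicely.
good, _ = curve_fit(model, x, y, p0=[4.0, 1.0, 1.0], maxfev=10000)
print("good start:", good)

# A poor starting guess may not converge within the iteration budget,
# or may stop at a much worse local fit.
try:
    bad, _ = curve_fit(model, x, y, p0=[0.01, 50.0, 50.0], maxfev=200)
    print("bad start :", bad)
except RuntimeError as err:
    print("bad start : fit did not converge:", err)
```

This is exactly why it can pay off to get starting values from one tool and do the final fit, with trustworthy statistics, in another.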

Anyway, now that I’ve been shifting back and forth between Matlab and R for a couple of weeks, I believe that the first article is finally ready and it’s time to continue work on the second article, which focuses entirely on optimization. In the first article we concentrated on model derivation and on examining structural and parameter uncertainty. In the next paper we will examine the private optima a bit further, both analytically and numerically, and then move on to the social optimum, where environmental externalities are also considered. I will write more about those in upcoming blog posts. I expect my main tool for the second article will again be Matlab, because at least currently it seems that no statistical analysis will be involved.

Thus, see you next time!

Best regards: Matti Sihvonen


 

30.01.2017 11:26 4th annual BONUS COCOA meeting (Sources & Sinks: A Tale of Coastal Biogeochemistry - BONUS COCOA) Dana Hellemann

Eventually, everything has to end, even the BONUS COCOA project. It still has 11 months to go, so it is not time to say goodbye YET... however, last week’s 4th annual meeting was already the last of our annual meetings.

Thus, the pressure to finish up is on, but luckily so are motivation and good mood. With the need to synthesize our work in mind, we made good use of the 3 1/2 days of the meeting, exchanging results from all the disciplines involved and making plans for how to proceed in the time left. More details on the content, and on what magic has to do with science, can be read on the COCOA benthosphere blog: My great German adventure.
 
[Photo] The happy participants of the 4th annual BONUS COCOA meeting at the Institute for Baltic Sea Research (IOW), Warnemuende. Not pictured: Alexander, Colin, Erik, Heather, Karen and Markus.


Thanks to the project coordinators Jacob and Daniel for a constructive meeting, and special thanks to Maren and her team for organizing and hosting it at the Institute for Baltic Sea Research (IOW) in Warnemuende. A direct view of the beach is always helpful when discussing coastal ecosystems!


 

PUBLIC ENGAGEMENT AWARD

The BONUS public engagement award competition is open and celebrates the best public engagement activity or product developed by BONUS projects. Make sure that your suggestions get nominated!
