My literature review component was submitted today, just in time for Christmas. I’ve learnt a lot but still have a lot more to learn, and I’m looking forward to getting back into processing more of my data from the previous field season and preparing for the next (Easter 2019), where I plan to carry out resistivity surveys and collect some repeat UAV surveys in SE Iceland. Thanks to my supervisor Dr Richard Waller for the support and encouragement while writing the literature review, and to Dr Sam Davenward and Gethin Evans for proofreading for me.
After leaving myself two additional hours to get to Birmingham, to account for the rush hour traffic, I found myself abandoning the car in a car park further away than planned and putting my recent running training to good use, arriving just in time to upload and present my research in the photogrammetry session at the UK National Earth Observation Conference.
I presented the results from the baseline survey I carried out earlier this year. The focus of the talk was on the 3D models and DTMs produced for sites in Iceland, and how they were constrained and processed to maximise resolution, so that cut and fill models may be created from future repeat surveys. It was interesting to see similar processing methods being used for monitoring blanket bogs, using lidar-generated point clouds.
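The cut and fill idea boils down to differencing two co-registered DTMs and converting the per-cell elevation change into volumes. As a rough sketch of the calculation (the function name and example arrays are my own, assuming both surveys have been gridded to the same resolution and extent):

```python
import numpy as np

def cut_fill(dtm_before, dtm_after, cell_size):
    """Estimate cut and fill volumes from two co-registered DTMs.

    dtm_before, dtm_after: 2D arrays of elevations (m) on the same grid.
    cell_size: grid resolution (m); each cell covers cell_size**2 m^2.
    Returns (cut, fill) volumes in cubic metres.
    """
    diff = dtm_after - dtm_before              # DEM of difference (DoD)
    cell_area = cell_size ** 2
    fill = diff[diff > 0].sum() * cell_area    # material gained
    cut = -diff[diff < 0].sum() * cell_area    # material lost (e.g. ice melt-out)
    return cut, fill

# Toy example: a 2x2 grid at 0.5 m resolution where one cell lowers by 1 m
before = np.array([[10.0, 10.0], [10.0, 10.0]])
after  = np.array([[10.0, 10.0], [10.0,  9.0]])
cut, fill = cut_fill(before, after, cell_size=0.5)
# cut = 0.25 m^3, fill = 0.0 m^3
```

In practice the hard part is the one the talk focused on: constraining the models well enough that the elevation differences reflect real surface change rather than georeferencing error.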
I also learned more about remote sensing and what data is available. Unfortunately, the resolution of the satellite data is not sufficient for my buried ice project, but I will be keeping an eye on Sentinel data for large-scale changes in surface topography over the course of my PhD.
If you saw the presentation, I welcome any feedback or questions. If you were not present, or want to review the presentation at your own pace, the slides are available in the download section of this website.
Busy day today in Leeds. I presented the results from my baseline survey in Iceland at the NSGG Postgraduate Research Symposium, and was featured in a poster about a geophysical survey of a mausoleum.
I was pleased with how my presentation went, and even if I didn’t pronounce Skeiðarárjökull quite right, it was very close and sounded vaguely Icelandic. I received lots of good feedback, and the other presentations stimulated some interesting discussion about a range of topics.
Unfortunately I missed a few of the presentations after lunch, but I did manage to squeeze in a quick meeting with George from Coptrz to discuss the latest drone technologies (particularly thermal imaging and multispectral cameras) and how they could be incorporated into my dead ice project, taking into account the need to transport the equipment by plane (LiPo batteries and planes don’t mix well).
I look forward to seeing how the other participants’ projects progress in the future, and would like to thank all involved for organising the event.
Today has been a bit of a weird day, to say the least. I’m currently on holiday in North Wales, but I have some final tweaks to do for the NSGG conference presentation that I have been working on. I realised that this is the first time I will have to pronounce Skeiðarárjökull in public… Panic stations!!!
So whilst hiking down the North Wales slate trails to Beddgelert, I have been listening to the pronunciation of Skeiðarárjökull on Forvo.com and repeating it as best I can.
Picture this: perfect weather (rare in North Wales), lush green forests, craggy mountains in the distance, and a strange man talking nonsense.
My wife Sam and dog Errol think I have gone mad.
Fingers crossed I can master this fiendishly difficult place name before Monday.
Following the creation of the 3D models from Iceland, I have been looking into ways of making them accessible and interactive for a range of users. It took little additional learning to move the models from the virtual reality world into augmented reality.
I achieved this using the Augment app and web app. The model works well, especially when combined with a physical tracker to mount the model on, such as a book or a business card. The user can manipulate the model by moving the tracker relative to an iOS or Android device. I would argue that you have more control over the model in virtual reality, and that augmented reality is more of a gimmick for this data set; however, for creating supplementary material for field courses and course practicals it may have some benefits.
If you want to try it out, you can download the app from the App Store or Google Play for free. Once you have downloaded the app, scan the QR code below, create a tracker (optional) and view one of the 3D models in the palm of your hand or on your desk.