
SoCiS Final Status Report - Cleaning

After a great summer I would like to share with you what I worked on. 😊 But before that I would like to thank everyone who made it possible. Thank you, Shane, for being my mentor this summer and guiding me through every problem. I definitely learned a lot and it was fun. 😃 Thanks to the SunPy community for taking me in and for all their help, and I also want to thank ESA and Maxime Perrotin for organizing this great program!

And now, we can check out the results! 😉

You can find the documentation in the project wiki and the theoretical background in my previous posts. Now I want to concentrate on the actual usage of the library on a real problem, from A to Z.

Let's say that we have a FITS file with several RHESSI visibilities and we would like to process it. First we need to get the xrayvision module (I'm assuming that SunPy is already installed).

To get the module we have to check it out from GitHub:
 git clone 

After that we have to install it, for example with pip (I am going to ask pip to keep the files in place with an editable install):
 cd xrayvision
 pip install -e .

Now we are ready to work on some data!

OK, you have already downloaded some data: it comes with the repository because of the tests that were implemented alongside the features. (You can even find a pre-commit git hook in the hooks folder if you want to run the tests before every commit.)

I am going to work with the following file:

First we should import the RHESSIVisibility class, numpy, matplotlib and read in our data:
 import numpy as np
 import matplotlib.pyplot as plt
 from xrayvision.Visibility import RHESSIVisibility
 rhessi_objects = RHESSIVisibility.from_fits_file("xrayvision/data/hsi_20020220_110600_1time_4energies.fits")

We now have 4 RHESSIVisibility objects in a list. I am going to work with the first one.
First I would like to see what image we can get from the data. To do that we have to transform it back to image space (I am going to create a 128x128 image from it):
 image = rhessi_objects[0].to_map_v2(np.zeros((128,128)))  

To view it we can use the matplotlib library.
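A minimal sketch of how that could look. The `image` variable stands in for the map returned by `to_map_v2` above; here I fill it with random data so the snippet is self-contained, and I save to a file instead of opening a window:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line in a desktop session
import matplotlib.pyplot as plt

# stand-in for the 128x128 map returned by to_map_v2 above
image = np.random.rand(128, 128)

fig, ax = plt.subplots()
im = ax.imshow(image, origin="lower", cmap="viridis")
fig.colorbar(im, ax=ax, label="intensity")
ax.set_title("Back-projected map")
fig.savefig("map.png")  # or plt.show() in an interactive session
```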

The result:

We would like to use the CLEAN algorithm on this image. To be able to do that we have to import it:
 from xrayvision.Clean import Hogbom
 from xrayvision.Clean import ReasonOfStop

We also need the dirty beam, which I am going to create from our visibility data. To do that I create a copy of the RHESSIVisibility object and replace all of the visibility values with ones. Then I multiply this with the visibilities we would get by observing a Dirac delta with our instrument.
 import copy
 psf_vis = copy.deepcopy(rhessi_objects[0])
 psf_vis.vis = np.ones(len(psf_vis.vis))
 dirac_vis = copy.deepcopy(psf_vis)
 dirac_map = np.zeros((128,128))
 dirac_map[63:64,63:64] = 1.
 dirac_vis.from_map_v2(dirac_map, center=(0,0))
 psf_vis.vis = np.multiply(psf_vis.vis, dirac_vis.vis)
 psf = psf_vis.to_map_v2(np.zeros((128,128)))
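Conceptually, setting every visibility to one and transforming back gives the instrument's point-spread function: the dirty beam is just the back-projection of the (u, v) sampling. A toy numpy illustration of that idea, with made-up (u, v) points rather than RHESSI's real sampling:

```python
import numpy as np

# a handful of made-up (u, v) sampling points, in cycles per map width
uv = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 3.0], [3.0, 1.0]])

n = 64
coords = (np.arange(n) - n // 2) / n       # sky coordinates in [-0.5, 0.5)
xx, yy = np.meshgrid(coords, coords)

# back-project unit visibilities: one cosine fringe per (u, v) point
beam = np.zeros((n, n))
for u, v in uv:
    beam += np.cos(2 * np.pi * (u * xx + v * yy))
beam /= beam.max()                         # normalize the peak to 1
```

The sum of fringes peaks at the map center, where every cosine equals one; the sidelobes around it are exactly what CLEAN later has to remove.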

And now we have the dirty beam:

We have everything we need to run CLEAN! Because it is an iterative method, we can set the maximum number of iterations or a threshold where it should stop. We also have to set the gain.
 clean = Hogbom(rhessi_objects[0], psf, 0.01, (128,128), gain=0.01, niter=1000)
 while clean.iterate(gain=False) is ReasonOfStop.NOT_FINISHED:
     pass
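For intuition, here is a toy, self-contained numpy sketch of what one Högbom CLEAN iteration does: find the brightest residual pixel, record a `gain` fraction of it as a point-source component, and subtract the correspondingly scaled, shifted dirty beam. This only illustrates the idea; it is not xrayvision's implementation:

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=100, threshold=0.01):
    # Toy Hogbom CLEAN; psf must have the same shape as dirty, peak centered.
    residual = dirty.copy()
    components = np.zeros_like(dirty)
    cy, cx = np.unravel_index(np.argmax(psf), psf.shape)
    for _ in range(niter):
        y, x = np.unravel_index(np.argmax(residual), residual.shape)
        peak = residual[y, x]
        if peak < threshold:
            break                          # clean enough, stop early
        components[y, x] += gain * peak
        # shift the beam so its peak sits on (y, x), then subtract it
        residual -= gain * peak * np.roll(np.roll(psf, y - cy, axis=0),
                                          x - cx, axis=1)
    return components, residual

# a cross-shaped toy beam with its peak in the middle
beam = np.zeros((64, 64))
beam[32, 32] = 1.0
beam[31, 32] = beam[33, 32] = beam[32, 31] = beam[32, 33] = 0.3

# a single point source of flux 5 at (20, 30), observed with that beam
dirty = 5.0 * np.roll(np.roll(beam, 20 - 32, axis=0), 30 - 32, axis=1)

components, residual = hogbom_clean(dirty, beam, gain=0.5, niter=100)
```

After a few iterations the component map holds nearly all of the flux at (20, 30) and the residual, sidelobes included, drops below the threshold.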

We can check why the iteration finished:

In our case it says that we reached the iteration limit. OK, how can we get our final CLEANed image?
 result_image = clean.finish(3)

When I called the finish function, I gave the standard deviation in pixels of the Gaussian which is convolved with the sky map. Our result:
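The restoring step can be sketched in plain numpy: convolve the accumulated point-source components with a normalized Gaussian of that standard deviation and, typically, add the residual map back. This is a generic sketch of the idea, not xrayvision's exact finish() code:

```python
import numpy as np

def restore(components, residual, sigma):
    # Convolve the CLEAN components with a normalized Gaussian clean beam
    # (via FFT, i.e. circular convolution) and add the residual map back.
    ny, nx = components.shape
    yy, xx = np.meshgrid(np.arange(ny) - ny // 2,
                         np.arange(nx) - nx // 2, indexing="ij")
    clean_beam = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    clean_beam /= clean_beam.sum()         # unit integral, so flux is preserved
    conv = np.real(np.fft.ifft2(np.fft.fft2(components) *
                                np.fft.fft2(np.fft.ifftshift(clean_beam))))
    return conv + residual

components = np.zeros((128, 128))
components[64, 64] = 2.0                   # one clean component of flux 2
restored = restore(components, np.zeros((128, 128)), sigma=3)
```

Because the beam integrates to one, the restored map keeps the total flux of the components while smoothing each delta into a Gaussian blob.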

As you can see when you compare it to the original image, CLEAN really did find some features in the image. To get a good and valid result, you have to set the right parameters; what you saw in this example were just placeholder values to demonstrate the process! Have fun! 😄

