Welcome to the BEAM Forum!

We encourage you to sign in to our forum and participate in the BEAM community. The forum is maintained by the BEAM project team, who will most likely answer your questions within 24 hours (except during common holidays), if other community members have not done so already. Collaborate, share your knowledge and learn from other users!

If you don't find what you are looking for, please also consider the following external forums:

NetCDF format batch processed using the gpt?
Hello,

I have tried to batch process a bunch of MERIS images that are in NetCDF format using the gpt tool, but it fails. Can the input images only be processed if they are in .N1 or .dim format?

What would be the recommended procedure for working with the NetCDF format?

Thanks,

Jose
RE: NetCDF format batch processed using the gpt?
1/4/12 4:43 PM as a reply to Jose M. Beltran.
I found that it is indeed possible: NetCDF files are treated like any other file. I received the error messages below, which I first interpreted as meaning it could not be done. But I tried again, got the same messages, waited until I saw the prompt line again, and it worked!
I am sharing this so other users know they should not panic; they may just have to wait a bit before seeing the process run.
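For anyone wanting to script this, here is a minimal sketch of such a batch run over NetCDF inputs. The graph file name (Graph.xml), the directory layout, and the sample file name are my own assumptions, not from this thread; the loop only echoes the command lines as a dry run, so drop the `echo` to actually invoke gpt.

```shell
#!/bin/sh
# Hedged sketch: batch-run BEAM's gpt over NetCDF products.
# "Graph.xml", the input/output directories, and the sample file
# are placeholders -- adapt them to your own setup.
mkdir -p input output
touch input/MER_sample.nc   # stand-in for a real MERIS NetCDF product

for f in input/*.nc; do
  base=$(basename "$f" .nc)
  # Dry run: print the command line; remove "echo" to actually call gpt.
  echo gpt Graph.xml -t "output/${base}.dim" "$f"
done
```

The SLF4J/mediaLib warnings discussed in this thread may still appear on stderr for each invocation; as noted below, they are harmless.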

These were the errors that I received.

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Error: Could not load mediaLib accelerator wrapper classes. Continuing in pure Java mode.
Occurs in: com.sun.media.jai.mlib.MediaLibAccessor
com.sun.media.jai.mlib.MediaLibLoadException

It did continue in pure Java mode and finished. This is great!
Thanks,
RE: NetCDF format batch processed using the gpt?
1/5/12 6:30 PM as a reply to Jose M. Beltran.
Hello Jose,

Good to see that your problem is not a real problem :-)

This error message comes from a library we are using.
We always work in pure Java mode anyway.

Regards,
Marco