10-03-2016 09:29 PM
Logging is to developers what metrics are to business intelligence: it provides useful insight into what's happening. Out of the box, your options are either to concatenate a massive string during model execution or to sprinkle Report operators throughout so you can print to the session console (which will include timestamps, type information, and your values). I went the home-brew route, and it turns out design matters.
My initial design was a "logging bus" that ran across the top of my model: simply a series of Cat operators terminated by a Report. The problems with that design are that (1) you only get a log file if the model completes successfully (i.e., it isn't incremental), and (2) it's messy and tends to muddy the water, especially if you have to do a whole bunch of string processing to strip out data-type markup.
The subsequent design addresses those issues: it encapsulates the type conversions so your output is clean, and the Logging "operator" appends, so you can log as the model runs. It also adds carriage returns after Labels and Tables (How, you ask? Have a look.). Model execution order is arbitrary, so you'll probably have to pair this sub-model with IfElse operators to force ERDAS to run it before bumbling along.
    --- Write float as decimal ---
    0.00000000867169
    String Value: Hello World
    === Pretty Table ===
    0.8654
    1.162851218
    1.460302436
    1.757753654
    2.055204872
    2.35265609
    2.650107308
    2.947558526
    3.245009744
    3.542460962
    Input Filename = c:/test/in\26202.tif
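The append-as-you-go design is worth spelling out. ERDAS's Spatial Modeler has no scripting API like this, so here's a minimal Python analogue (every name in it is mine, purely illustrative): open the log in append mode so partial runs still leave a file behind, convert each value type to clean text, and add line breaks after labels and tables.

```python
# Illustrative Python analogue of the append-mode logging sub-model.
# Nothing here is ERDAS API -- the sketch just shows the design:
# append mode, clean type conversion, newlines after labels and tables.

def log_append(log_path, value, label=None):
    """Append one entry to the log, rendering the value as clean text."""
    with open(log_path, "a") as log:            # append: partial runs still produce a log
        if label is not None:
            log.write(label + "\n")             # carriage return after the label
        if isinstance(value, float):
            log.write(f"{value:.14f}\n")        # decimal notation, never scientific
        elif isinstance(value, (list, tuple)):  # a "table": one cell per line
            for cell in value:
                log.write(f"{cell}\n")
            log.write("\n")                     # blank line after the table
        else:
            log.write(f"{value}\n")

log_append("model.log", 8.67169e-9, label="--- Write float as decimal ---")
log_append("model.log", [0.8654, 1.162851218], label="=== Pretty Table ===")
```

The key design choice mirrors the text: because the file is opened in append mode on every call, a model that dies halfway still leaves everything logged up to that point.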
Here's what it looks like under the hood:
* The only truly required ports are LogFileName and Label, plus any one of the other ports.
* Mind the port names: they tell you which types they'll accept. E.g., tblfltValue accepts Tables and Floats; sci2numValue will render scientific notation as an integer or decimal.
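That port naming convention can be sketched as simple type dispatch. Again, these function names are hypothetical stand-ins for the sub-model's ports, not anything ERDAS exposes:

```python
# Hypothetical sketch of the port naming convention: the prefix encodes
# which input types a port accepts and how it renders them.

def coerce_tblflt(value):
    """tblfltValue: accept a Table (modeled here as a list) or a Float."""
    if isinstance(value, (list, float)):
        return value
    raise TypeError("tblfltValue accepts only Tables and Floats")

def coerce_sci2num(value):
    """sci2numValue: render scientific notation as a plain integer or decimal."""
    as_float = float(value)
    if as_float.is_integer():
        return str(int(as_float))               # e.g. 1.5e3 -> "1500"
    return f"{as_float:.14f}".rstrip("0")       # e.g. 8.67169e-9 -> "0.00000000867169"

print(coerce_sci2num("1.5e3"))   # -> 1500
print(coerce_sci2num("8.67169e-9"))
```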