Smart M.App discussions

Looking for answers regarding M.Apps, M.App Exchange or M.App Studio? The Smart M.App Tech Discussion board is where you can search, ask your questions and assist others by sharing your knowledge. Join the conversation, connect, contribute and share.
Posts: 59
Registered: 04-19-2016
Accepted Solution

Transform .shp attribute data into CSV format suitable for time-series BI Map



Has anyone got a fast way to transform the attribute data in a shapefile into a .csv file for my BI Map?


At the moment the data is as follows:


GEOID, July, August, September, October, November, December, January, February, March, April, May, June


And I need my data in .csv as:


GEOID, Month-Yr, Value


Any help would be greatly appreciated.



Technical Evangelist
Posts: 1,307
Registered: 09-11-2015

Re: Transform .shp attribute data into CSV format suitable for time-series BI Map

Hi Jens,


I think you will need to load or export the data into a database or an Excel file and use its UNPIVOT functionality.
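The wide-to-long unpivot Jan describes can also be done with pandas once the attribute table is exported to CSV. A minimal sketch, assuming the export step is already done; the file name, sample values and the "-2016" year suffix are illustrative, not from the thread:

```python
import io

import pandas as pd

# Stand-in for the exported attribute table (normally pd.read_csv("export.csv")).
wide_csv = io.StringIO(
    "GEOID,July,August,September\n"
    "12345,10.1,11.2,12.3\n"
    "67890,20.4,21.5,22.6\n"
)
wide = pd.read_csv(wide_csv)

# melt() is pandas' unpivot: one (GEOID, month, value) row per month column.
long = wide.melt(id_vars="GEOID", var_name="Month-Yr", value_name="Value")

# Append a year so the column really is Month-Yr (the year here is assumed).
long["Month-Yr"] = long["Month-Yr"] + "-2016"

# Write the long-form CSV for the BI Map.
long.to_csv("timeseries.csv", index=False)
```

The same reshape is available as UNPIVOT in SQL Server or via Power Query's "Unpivot Columns" in Excel; melt is just the scriptable equivalent.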







Jan Neumann
Post Sales Engineer Web Applications
Hexagon Geospatial

Frequent Contributor
Posts: 106
Registered: 06-03-2016

Re: Transform .shp attribute data into CSV format suitable for time-series BI Map

Hi Jens, 



Were you looking for a solution limited to Spatial Modeler (SM) operators? If not, the solutions already suggested are the fastest approaches, so feel free to move on to the next post. If you do want to implement this in SM, here are some ideas.


Over here, Ian used the Table File Output operator to output a single attribute column, which satisfies your 'fast' requirement. The beauty of the Table operators is that they're straightforward and fast. To achieve your pivot, you'll need a series of Select Attributes operators, each of which yields one record. The round trip through CSV may not be required in this case; you might be able to append these records to a list of Dictionary items instead. If you did have to write a file, you would have to export each column to its own text file (you could later read each column of data back in using the Table File Input operator).


RE Data Containers:


I learned that in ERDAS, "tables" are really just "columns", i.e., lists. Matrices support multiple columns but are, of course, limited to numerical data. So there really isn't a container that can store your layer's attributes: you can't use matrices for non-numeric attributes, and you can't use tables unless you have only one column of data. Even if you had multiple columns of numerical data and used Larry's wizardry to store your layer attributes in a matrix, the Table File Input/Output operators won't respect the boundary between matrix columns; they would write single-column records as column1column2, with no delimiter between your column data. (If you had coordinate attributes, for example, you could use a three-column matrix and store your content in columns 2-3, which results in a space-delimited list of ordinates.) Bottom line: tables are a no-go.


The Data Input/Output operators do respect the column boundaries in matrices, and you have quite a few options for organizing your data. However, constructing this dataset will kill your 'fast' requirement. You could iteratively append attribute values to lists, one per column, and then append each column's list into a higher-level object like a Dictionary (right-click and Add Ports as needed to accommodate the number of columns you have). This lists-of-lists approach lets you store text *and* numbers in a pseudo-tabular form. If you needed to write a file, use the Data IO operators to write/read a JSON file (not CSV). Note that the Report operator produces less structured output, where each column's data is written out sequentially, so you'd have more work to do when reading that data back. Anyhow, I suppose the silver lining of this approach is that you're in control of how your data is structured.


Other Options: 


Feature layers are a static data source. An alternative would be to use some external process (e.g., an ArcGIS export to CSV) to supply that CSV to your Smart M.App. You could then use the SplitString operator to chop each CSV record into fields.
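The record-splitting step SplitString performs is, in ordinary Python, a comma split, or more robustly the csv module (which also handles quoted fields). A small sketch with an invented record in the target layout:

```python
import csv
import io

# Two CSV records in the desired GEOID, Month-Yr, Value layout (sample data).
payload = "GEOID,Month-Yr,Value\n12345,Jul-2016,10.1\n"

# csv.reader chops each record into a list of fields, like SplitString would.
rows = list(csv.reader(io.StringIO(payload)))
header, first = rows[0], rows[1]
```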


You could also push your feature layer through a WMS service and use the WMS API to query for attributes. That approach will require quite a bit more JS.





Posts: 59
Registered: 04-19-2016

Re: Transform .shp attribute data into CSV format suitable for time-series BI Map

Hi Noel,


Thanks for your lengthy explanation. I really appreciate it.


For now I will use the query -> transform route to expedite my testing; however, I am definitely still looking for a solution using spatial operators in the modelling environment.


I will continue to research and will post back if I come up with a nice logical model to convert my data; please do the same if you beat me to it.