Export formats - JSON

Philip Balister philip at balister.org
Wed Jun 29 23:19:49 BST 2016


Yeah top posting is evil ...

So I think we are missing the underlying reason Philip is interested in
a standard format for storing survey data.

Whenever someone comes along and says "I can make a better cave survey
program", they start by reinventing wheels. Now, survex has pretty much
solved the data processing problem and is used by cavewhere and therion.
(On my wishlist: have survex export an API so people do not take the
code and hack at it.)

But we still have the problem of data from things like smaps, compass,
walls, etc. (I'm assuming everyone on this list understands the evils of
binary-only programs.)

What we need is to start moving to a standard, extensible format that is
easily usable without everyone having to write yet another parser for
processing cave data. If we can get this format in place, and hopefully
adopted, people can process data with the program that makes them happy,
without annoying other people who like other programs.
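
To make that concrete, here is a purely illustrative sketch in Python of
how a single survey leg could be stored as JSON and read back with
nothing but a stock library. The field names are my own invention for
the example, not taken from the metacave spec:

import json

# Hypothetical record for one survey leg -- field names are made up
# for illustration, NOT copied from the metacave spec.
leg = {
    "from": "entrance.1",
    "to": "entrance.2",
    "tape": 5.43,      # metres
    "compass": 123.5,  # degrees
    "clino": -4.0,     # degrees
}

text = json.dumps(leg, indent=2)  # what would land in the export file
parsed = json.loads(text)         # any language's JSON library can read it back
assert parsed["tape"] == 5.43

The point being that every mainstream language already ships a parser
for this, so nobody has to write one.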

Ideally, someone would write something to translate svx files into metacave
for people who like the survex format. Possibly even build this into
survex, since it already knows how to parse svx, and then use a library
to write out the data as metacave.
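
To give an idea of how small such a translator could start out, here is
a rough Python sketch. It only handles *begin/*end nesting and plain
from/to/tape/compass/clino data lines, skips everything else, and the
output field names are again my own invention rather than the metacave
spec -- it is just meant to show the shape of the job:

import json
import sys

def svx_to_json(lines):
    """Very rough svx reader: *begin/*end nesting plus plain data lines.
    Units, calibration, *data ordering, splays etc. are all ignored."""
    prefix = []  # current *begin nesting, used to qualify station names
    legs = []
    for raw in lines:
        line = raw.split(";", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        lower = line.lower()
        if lower.startswith("*begin"):
            parts = line.split()
            prefix.append(parts[1] if len(parts) > 1 else "")
        elif lower.startswith("*end"):
            if prefix:
                prefix.pop()
        elif line.startswith("*"):
            continue  # other commands not handled in this sketch
        else:
            fields = line.split()
            if len(fields) < 5:
                continue
            try:
                tape, compass, clino = (float(x) for x in fields[2:5])
            except ValueError:
                continue  # plumbed legs, "-" entries etc. not handled here
            def qualify(station):
                return ".".join([p for p in prefix if p] + [station])
            legs.append({
                "from": qualify(fields[0]),
                "to": qualify(fields[1]),
                "tape": tape,
                "compass": compass,
                "clino": clino,
            })
    return json.dumps({"legs": legs}, indent=2)

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        print(svx_to_json(f))

Building this into survex itself, reusing its existing parser and just
adding a JSON writer, would obviously be more robust than re-parsing
svx from scratch like this.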

Once we have a standard format to store data in, people interested
in processing it do not need to sit down and come up with a storage
format when they have some crazy idea to do with cave data. The end goal
is to get more people interested in processing cave data, using a standard
format, so we can run our data sets easily through multiple programs. The
more people we can interest in working on the software, no matter what their
personal favourite package is, the healthier the cave survey software ecosystem.

Philip

On 06/24/2016 05:13 AM, Andy Waddington (Cave Surveying mailbox) wrote:
> Sometime before sending, Philip Schuchardt typed (and on Thursday 2016-06-23 at 13:40:26 sent):
> 
>>  Do you ever modify your KML or SVG files by hand? They're both plain
>> text XML documents.
> 
> Well, I modify .gpx files in a text editor (without syntax highlighting)
> as a matter of routine, because they are plain text xml and editing them
> is dead easy. I have also been known to edit those same tracklogs in a mapping
> application with a graphical interface. In some respects that is more
> intuitive, but typically more time-consuming than the hand-editing.
> Often I will pick points in the graphical application to identify which
> bits of the xml to hack in the text editor, which is often the most
> efficient way to accomplish my goals.
> 
> The point is that all these ways are available to me because xml, at
> least as used in .gpx files, is sufficiently well structured and readable
> that it is perfectly possible to edit in plain text. Not everyone will
> want to, but the aim should be to make it possible to do so, since
> then you can fix niggly little errors or add experimental features
> without needing full support from your GUI application.
> 
> It's a whole lot easier than editing low-level binary drawing
> files with a hex editor :-) And xml is a lot easier to hack than
> (in another application entirely) GEDCOM, a structured file
> format which looks easy to edit in plain text, but which is
> incredibly error-prone owing to the cross-references to lines
> very far away in the file.
> 
> So yes, xml is desirable for its ease of editing (and there are
> indeed editors designed to edit it and maintain its
> structural integrity - a bit more to learn, but probably easier
> than plain text if you do a lot of it).
> 
> And for those document formats Martin mentions like TeX,
> plain text is great. I've always preferred What You See Is A
> Description Of What You Will Eventually Get editors to the
> newer WYSIWYG ones. It is so much easier to write little
> scripts to make changes. Even direct hacking of PostScript
> is easier than using some of the bloated desktop DTP
> files...
> 
> Andy
> 
> 


