Comments on Evangelical Textual Criticism, "The problem with digitizing our discipline" (P.J. Williams)

Yes, some formats have a longer life than others, and HTML and XML fit that category. For now. But who wants to make a website with static HTML nowadays? And even XML needs a specification such as TEI to give it structure. But certainly TEI has not been static.

Peter Gurry (2017-12-26)

The angst expressed here is accompanied by some very muddled thinking.

There are basic formats which do not age and which do not go away: plain text, XML, USFM for Scripture, standards-compliant HTML. None of these is suddenly rendered inaccessible by technological progress. It is the use of fancy, proprietary technologies which causes the issues described.

And this is not new. Closed-source, proprietary formats and arbitrary copyright restrictions have been criticised for exactly this reason for as long as computers have existed. Richard Stallman founded the Free/Libre Software movement in the 1980s and wrote the General Public License to counter the risk of data being rendered inaccessible because the programs that read it become unavailable. Larry Lessig created the Creative Commons movement and its free licensing regime in 2001 to ensure that created works and texts remain usable even when the rights owners are no longer around.
OSIS XML was created in the early 2000s to provide a reliable international standard for encoding Scripture, including all its variants, its apparatus, etc., in an open and lasting fashion. Around the same time SIL, UBS, and others agreed on USFM as another equally accessible and lasting standard for encoding Scripture, particularly translations. No text encoded in this fashion is inaccessible now, nor will it ever become inaccessible. The tools to handle such texts are well documented, ubiquitous, and often open source and freely licensed themselves.

It is shortsightedness and an unwillingness to do one's homework which lead to inaccessible websites and texts, not technological progress.

Peter von Kaehne (2017-12-23)

As a former librarian, here is another part of this issue to consider, apart from the technical issues: the half-lives of information. Books are written, read, and cited; read by others who are following up citations; and then either (rarely) become a long-term standard work or (commonly) become a work that had relevance for a somewhat limited period but over time is less cited and less read. In STEM subjects this period of relevance – the half-life within which half of all reading and citation of the book occurs – is very short; in the humanities it may be over a decade, with a long tail-off period. After a few decades the book is either weeded from collections or relegated to off-site closed-access storage.
If later the information in the book becomes relevant again, it can usually still be located in a few libraries.

In contrast to the long half-life of information in humanities books, information in online items such as blog posts, forum posts, web pages, and the like has a much shorter half-life, often being out of date within a few days. Some of the information will be archived in multiple places, but not all of it, and much information exists, within a decade of its creation, only on a single computer maintained by a single person or organisation. Eventually the computer fails, or the organisation decides the information has no value to it and is not worth the cost of storage, or the person loses interest in the apparently out-of-date information, changes role, retires, or dies; nobody maintains the information, and it ceases to exist. If later the online material becomes relevant again – too bad, it can't be located anywhere.

Bob Relyea is correct that it is a good idea to have multiple and continuous copying, but who is responsible for the labour, management, testing, hardware, software, etc. required to do this across the full range of online resources, including many that seem at the time to be of little long-term relevance? And who will continue with this responsibility not just for a few years, or a few decades, but for hundreds of years?

Matthew Hamilton (2017-12-22)

The best data preservation strategy is the same as it has always been: multiple and continuous copying. The Internet Archive (archive.org) looked into what physical formats were needed to preserve its data long term.
The best answer they came up with was "keep multiple copies, and make fresh copies frequently."

This means that while tools like the VMR or the various library websites are great, at some point we need to have multiple copies of all these things. So the VMR, the BnF, the Vatican, the British Library, CSNTM, etc. should hold copies of each other's manuscript images and metadata. Ideally the metadata should be downloadable so that various groups can build their own tools to access it, but I think that at this time libraries would be unwilling to release their copyrights on their digital copies.

Bob Relyea (2017-12-21)

Link rot should be a major concern as well. What happens in 10 years when the BnF or the Vatican change all their servers and the URLs to these wonderful images change? I know Troy Griffitts has had to deal with this with the VMR at times. So some of this is a matter of constant maintenance. But therein lies the problem. I suppose the same problem faces libraries, but print books are very low maintenance.

Peter Gurry (2017-12-21)

Finally, a post on ETC that I can comment on! The phenomenon you describe is very real; many technologies are becoming obsolete in a hurry. The risk of that happening to very popular file formats such as Word or PDF documents is somewhat smaller; there is usually backward compatibility built into newer versions of MS Office – but how long will MS Office be around? Who knows? Probably decades, but there's no way of knowing for sure.
Yet I think that you'll be able to use those file formats for a long time to come.

Also an issue: "link rot". What happens if you link to someone's online article in your blog post, and that person removes their online content some time later? The link in your blog post still exists, but goes to... nothing. That could even happen if the technology behind popular websites changes (say, www.codexsinaiticus.com switches to a more modern viewer).

Technology is a wonderful enabler of things that might otherwise not be possible (or be very hard to do), such as the CBGM, but it is not without its own risks.

Anonymous (2017-12-21)
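[Editor's note] The "link rot" concern raised in the last two comments is easy to begin auditing: before a link can be checked or archived, it has to be found. A minimal sketch, using only Python's standard library, that extracts the outbound links from a saved copy of a blog post (the HTML content is invented for illustration):

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect the href of every anchor tag in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; value may be None.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html: str) -> list[str]:
    """Return all anchor hrefs found in the given HTML string."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```

Each collected URL could then be re-checked on a schedule (e.g. with urllib.request) and submitted to an archiving service such as the Internet Archive before it disappears.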
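[Editor's note] Bob Relyea's "multiple and continuous copying" only preserves data if the copies are periodically verified against each other; a replica that has silently rotted is no backup at all. A minimal sketch of such a check, assuming hypothetical file paths, using only Python's standard library:

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_replicas(master: Path, replicas: list[Path]) -> list[Path]:
    """Compare each replica's digest against the master copy.

    Returns the replicas that are missing or have diverged,
    i.e. the ones that need to be re-copied.
    """
    expected = sha256_of(master)
    return [r for r in replicas if not r.exists() or sha256_of(r) != expected]
```

Running this regularly across institutions (VMR, BnF, British Library, etc.) is exactly the kind of ongoing labour Matthew Hamilton asks about: the check itself is trivial; sustaining it for centuries is not.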
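[Editor's note] Peter von Kaehne's point about durable open formats can be demonstrated directly: an OSIS-flavoured XML fragment (the element names below are simplified for illustration, not the full OSIS schema) remains machine-readable with nothing beyond Python's standard library:

```python
import xml.etree.ElementTree as ET

# A simplified, OSIS-flavoured fragment (illustrative only).
FRAGMENT = """
<verse osisID="John.1.1">
  In the beginning was the Word
  <note type="variant">an apparatus entry could go here</note>
</verse>
"""

root = ET.fromstring(FRAGMENT)
print(root.get("osisID"))             # the verse reference attribute
print(root.find("note").get("type"))  # the embedded apparatus note's type
```

Because the format is a published open standard, any future XML parser in any language can recover the same structure; no proprietary software has to survive for the text to remain accessible.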