
Techwatch: Image formats


Rodney


  • Admin

This tech watch focuses on one specific foray into bringing image formats up to date with modern hardware and software, but there is much more to come. I hesitate to say there is a war of sorts on the horizon, but some negotiations are ongoing to determine which approaches get adopted and which settle in for the long term.

 

Microsoft has initiated its rollout of the HEIF format.

There currently aren't any editors for the format and the focus is on playback.

This would appear to be a byproduct of Microsoft's purchase of Nokia several years ago.

 

 

For more information see:

 

http://nokiatech.github.io/heif/

 

Of interest, this format is one of many approaches that form a container around data which is then processed in a standardized, optimized way... as opposed to processing the data differently for each data type. Containers aren't anything new, but this round would suggest that after studying various options in the wild, where useful feedback could be collected, the approach could be further standardized and optimized.
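
For a concrete sense of what "container" means here: HEIF files are built on the ISO Base Media File Format, where a file is simply a sequence of typed "boxes." Below is a minimal Python sketch (my own illustration, not code from the HEIF project) that walks the top-level boxes of a file; the file name 'photo.heic' is hypothetical.

    import struct

    def iter_boxes(path):
        # Walk the top-level boxes of an ISO Base Media File Format file.
        # Every box starts with a 4-byte big-endian size and a 4-character
        # type code such as 'ftyp', 'meta', or 'mdat'; the payload follows.
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    return
                size, box_type = struct.unpack(">I4s", header)
                consumed = 8
                if size == 1:  # 64-bit "largesize" variant
                    size = struct.unpack(">Q", f.read(8))[0]
                    consumed = 16
                yield box_type.decode("ascii", "replace"), size
                if size == 0:  # box runs to the end of the file
                    return
                f.seek(size - consumed, 1)  # skip the payload

    # 'photo.heic' is a hypothetical file name; a HEIF file's first box
    # should be 'ftyp' with a brand such as 'heic' or 'mif1'.
    for box_type, size in iter_boxes("photo.heic"):
        print(box_type, size)

The point is that a reader never needs to understand the image codec to navigate the file; the container structure is the same whether the payload is HEVC, a thumbnail, or metadata.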

 

Disclaimer: These tech watches usually don't affect us in the short term, as they point to technologies on the horizon.

Those who are aware, however, can take advantage.


  • Admin

Here's the news that is making the rounds:

 

https://venturebeat.com/2018/03/16/microsoft-releases-two-new-windows-10-previews-with-high-efficiency-image-file-format/

 

 

 

One of the reasons 'image wars' might be an appropriate classification is that many of the new approaches will leave users of older approaches out in the cold. While it's certain that some will take advantage of the newer approaches and build bridges going backward, most will go the easier route and build bridges going forward. What this means is that to take advantage of the modern architecture, PC users will want to stay current with Windows 10. This is the ongoing effect of Windows as a service.


  • Admin

It appears some on Mac platforms may have already been using the HEIF format:

 

Here's an article from September 2017:

 

https://www.cultofmac.com/487808/heif-vs-jpeg-image-files/

 

At the end of that article is a link to yet another article entitled "HEIF: A first nail in JPEG's coffin?":

 

https://iso.500px.com/heif-first-nail-jpegs-coffin/


  • Admin

Nokia's JavaScript HEIF library appears to still be maintained on GitHub:

 

https://github.com/nokiatech/heif/tree/gh-pages/js

 

 

One thing that isn't clear to me at this point is how encumbered HEIF might be with patents.

HEVC was known to be heavily encumbered by patents.

 

This license appears to be very current: https://github.com/nokiatech/heif/blob/master/LICENSE.TXT

Formatting updated as of 9 days ago.


  • Admin

I had heard the term before, but that hadn't added anything useful to my understanding of it.

It would seem that we missed a lot of new terms being rolled out in a similar vein:

 

https://pc.net/helpcenter/answers/kibibytes_mebibytes_and_gibibytes

 

That article was from 2005... and the measurements were put into effect back in 1998, so we are definitely behind the power curve.

 

From the (brief) article:

 

The size of computer data is measured in bytes. Larger units of bytes are often measured in kilobytes, megabytes, and gigabytes. However, the size of these units can be somewhat ambiguous. For example, a kilobyte can equal 1,024 bytes or 1,000 bytes, depending on the context in which it is used. A megabyte may equal 1,048,576 bytes or 1,000,000 bytes.

In 1998, the International Electrotechnical Commission (IEC) introduced new units of measurement to avoid this confusion.

These units are all exact measurements and cannot be estimated like kilobytes, megabytes, and gigabytes.

 

This whole problem apparently arose because we like to round things to nice even numbers with lots of zeroes, but that creates a lot of problems because in that particular context 1000 gets treated as if it were 1024.

We could (at least theoretically) lose those additional 24 bytes if the difference isn't taken into consideration, because 2^10 = 1024, not 1000.

At first blush it would appear that, where we reference exact byte-wise counts, swapping in the 'bi' syllable (kilo → kibi, mega → mebi, giga → gibi) to declare that level of accuracy would be more technically correct.
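
To make the two conventions concrete, here is a quick Python sketch (my own illustration; the numbers are just examples) of the decimal (SI) versus binary (IEC) readings of the same quantity:

    KB, MB, GB = 10**3, 10**6, 10**9        # decimal (SI) kilo-, mega-, gigabyte
    KiB, MiB, GiB = 2**10, 2**20, 2**30     # binary (IEC) kibi-, mebi-, gibibyte

    size = 5_000_000_000                    # a drive sold as "5 GB"

    print(size / GB)   # 5.0    decimal gigabytes
    print(size / GiB)  # ~4.66  binary gibibytes -- the "missing" space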

 

 

There were suits filed by folks convinced they were being robbed of extra bytes due to this ambiguity:

https://www.cnet.com/news/gigabytes-vs-gibibytes-class-action-suit-nears-end/

 

 

Thanks to Robert for making the mention... I learned some very interesting things today.


  • Admin

I have read that, of the difference between 1,000 bytes (1 KB) and 1,024 bytes (1 KiB), the 'extra' 24 bytes is allocated to the file system.

I'm not sure if this is actually the case, but it would make sense and perhaps additionally explain why Windows has long used KiB-style binary math under the hood.

In systems such as Linux, where KB is used exclusively, I would assume the same thing is being done, just in a different way.
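
Whatever the explanation, the arithmetic behind the familiar mismatch is easy to reproduce. A quick sketch (my own illustration) of why a drive sold as "500 GB" shows up as roughly 465 in an OS that does binary math but keeps the 'GB' label:

    advertised = 500 * 10**9          # "500 GB" as the manufacturer counts it
    reported = advertised / 2**30     # binary math, but still labeled "GB"
    print(round(reported, 2))         # 465.66 -- the familiar smaller number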

