asar

Simple extensive tar-like archive format with indexing

Statistics on asar

Number of watchers on GitHub: 950
Number of open issues: 45
Average time to close an issue: about 1 month
Main language: CoffeeScript
Average time to merge a PR: 5 days
Open pull requests: 4+
Closed pull requests: 7+
Last commit: over 1 year ago
Repo created: about 5 years ago
Repo last updated: over 1 year ago
Size: 169 KB
Organization / Author: electron
Contributors: 10

asar - Electron Archive


Asar is a simple extensive archive format; it works like tar in that it concatenates all files together without compression, while still supporting random access.

Features

  • Supports random access
  • Uses JSON to store the files' information
  • Very easy to write a parser

Command line utility

Install

$ npm install asar

Usage

$ asar --help

  Usage: asar [options] [command]

  Commands:

    pack|p <dir> <output>
       create asar archive

    list|l <archive>
       list files of asar archive

    extract-file|ef <archive> <filename>
       extract one file from archive

    extract|e <archive> <dest>
       extract archive


  Options:

    -h, --help     output usage information
    -V, --version  output the version number
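
A typical round trip with these commands (app and app.asar match the packing examples below; index.js and ./app-extracted are placeholder names):

$ asar pack app app.asar
$ asar list app.asar
$ asar extract-file app.asar index.js
$ asar extract app.asar ./app-extracted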

Excluding multiple resources from being packed

Given:

    app
(a) ├── x1
(b) ├── x2
(c) ├── y3
(d) │   ├── x1
(e) │   └── z1
(f) │       └── x2
(g) └── z4
(h)     └── w1
Exclude: a, b

$ asar pack app app.asar --unpack-dir "{x1,x2}"

Exclude: a, b, d, f

$ asar pack app app.asar --unpack-dir "**/{x1,x2}"

Exclude: a, b, d, f, h

$ asar pack app app.asar --unpack-dir "{**/x1,**/x2,z4/w1}"

Using programmatically

Example

var asar = require('asar');

var src = 'some/path/';
var dest = 'name.asar';

asar.createPackage(src, dest, function() {
  console.log('done.');
})

Please note that there is currently no error handling provided!
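
Besides creating archives, the module exports helpers that mirror the CLI subcommands. Assuming the installed version exposes listPackage, extractFile and extractAll (the archive and file names below are placeholders), a minimal sketch:

var asar = require('asar');
var fs = require('fs');

// List every path recorded in the archive's header.
console.log(asar.listPackage('name.asar'));

// Read a single file out of the archive as a Buffer.
var content = asar.extractFile('name.asar', 'index.js');
fs.writeFileSync('extracted-index.js', content);

// Extract the whole archive into a directory.
asar.extractAll('name.asar', './name-unpacked');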

Transform

You can pass in a transform option: a function that takes the file path and returns either nothing or a stream.Transform. When a stream is returned, it is applied to that file's contents before they are written into the .asar archive (e.g. to compress them); a concrete compression sketch follows the generic example below.

var asar = require('asar');

var src = 'some/path/';
var dest = 'name.asar';

function transform(filename) {
  return new CustomTransformStream()
}

asar.createPackageWithOptions(src, dest, { transform: transform }, function() {
  console.log('done.');
})
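
As a concrete illustration of the transform hook (not part of the original example), the sketch below gzips .js files on their way into the archive and returns nothing for everything else; zlib.createGzip() is a standard stream.Transform. Only the transformed bytes end up in the archive, so whatever reads it later has to apply the matching inverse transform.

var asar = require('asar');
var path = require('path');
var zlib = require('zlib');

var src = 'some/path/';
var dest = 'name.asar';

// Gzip JavaScript files as they are written into the archive; returning
// nothing for other files leaves them untouched.
function transform(filename) {
  if (path.extname(filename) === '.js') {
    return zlib.createGzip();
  }
}

asar.createPackageWithOptions(src, dest, { transform: transform }, function() {
  console.log('done.');
});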

Using with grunt

There is also an unofficial grunt plugin to generate asar archives at bwin/grunt-asar.

Format

Asar uses Pickle to safely serialize binary values to a file; there is also a Node.js binding of the Pickle class.

The format of asar is very flat:

| UInt32: header_size | String: header | Bytes: file1 | ... | Bytes: file42 |

The header_size and header are serialized with the Pickle class, and header_size's Pickle object is 8 bytes.

The header is a JSON string, and header_size is the size of header's Pickle object.

The structure of header looks something like this:

{
   "files": {
      "tmp": {
         "files": {}
      },
      "usr" : {
         "files": {
           "bin": {
             "files": {
               "ls": {
                 "offset": "0",
                 "size": 100,
                 "executable": true
               },
               "cd": {
                 "offset": "100",
                 "size": 100,
                 "executable": true
               }
             }
           }
         }
      },
      "etc": {
         "files": {
           "hosts": {
             "offset": "200",
             "size": 32
           }
         }
      }
   }
}

offset and size record the information needed to read the file from the archive. The offset starts from 0, so you have to manually add the sizes of header_size and header to offset to get the real offset of the file within the archive.

offset is a UINT64 number represented as a string, because there is no way to precisely represent UINT64 with a JavaScript Number. size is a JavaScript Number that is no larger than Number.MAX_SAFE_INTEGER, which has a value of 9007199254740991 and corresponds to about 8 PB. size is not stored as UINT64 because file sizes in Node.js are represented as Numbers and it is not safe to convert a Number to UINT64.
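
Since the layout is this simple, the header can be read with nothing but fs. The sketch below is a minimal reader, assuming the standard Chromium Pickle layout (a UInt32 payload size in front of each object and a UInt32 length in front of the string) written little-endian, and using the hypothetical name.asar archive and the /usr/bin/ls entry from the example header above.

var fs = require('fs');

function readAsarHeader(archivePath) {
  var fd = fs.openSync(archivePath, 'r');
  try {
    // First Pickle object (8 bytes): UInt32 payload size, then header_size.
    var sizeBuf = Buffer.alloc(8);
    fs.readSync(fd, sizeBuf, 0, 8, 0);
    var headerSize = sizeBuf.readUInt32LE(4);

    // Second Pickle object (header_size bytes): UInt32 payload size,
    // UInt32 string length, then the JSON header itself.
    var headerBuf = Buffer.alloc(headerSize);
    fs.readSync(fd, headerBuf, 0, headerSize, 8);
    var jsonLength = headerBuf.readUInt32LE(4);
    var header = JSON.parse(headerBuf.slice(8, 8 + jsonLength).toString('utf8'));

    // File contents start right after the two Pickle objects.
    return { header: header, filesOffset: 8 + headerSize };
  } finally {
    fs.closeSync(fd);
  }
}

// Locate and read /usr/bin/ls using the example header above.
var archive = readAsarHeader('name.asar');
var entry = archive.header.files.usr.files.bin.files.ls;
var buf = Buffer.alloc(entry.size);
var fd = fs.openSync('name.asar', 'r');
fs.readSync(fd, buf, 0, entry.size, archive.filesOffset + parseInt(entry.offset, 10));
fs.closeSync(fd);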

asar open issues
  • about 3 years Question: Is there a reason for unpack using absolute path while unpackDir uses glob relative to <dir>?
  • about 3 years asar fs.readFileSync() fails with Error: ENOENT on macOs Sierra
  • about 3 years Path too long issues on Windows
  • about 3 years npm linked packages problem
  • over 3 years CoffeeScript -> ES6
  • over 3 years --unpack-dir seems to have different behaviors on OS X and Windows
  • over 3 years --unpack-dir does not unpack symlinks
  • over 3 years (Windows Only) Cannot recognized `\\?\` prefix in unpacked path
  • over 3 years Allow specifying unpacked directory location
  • over 3 years A ".cache" folder ?
  • over 3 years out of space ENOSPC
  • over 3 years LSOpenURLsWithRole() failed with error -10810
  • over 3 years Also note that it requires the '-g' argument for terminal/cmdline usage.
  • over 3 years Invalid package name.asar at invalidArchiveError
  • almost 4 years Dependency calling chdir
  • almost 4 years Using asar programmatically in electron
  • almost 4 years Pack with MAC OS and extract with Windows 8.
  • about 4 years Add Encryption Feature
  • about 4 years Add the ability to unpack multiple folders.
  • over 4 years fails with an error "No such file or directory"
  • over 4 years When using --unpack, the directory info is not updated
  • over 4 years Error while executing asar archive
  • over 4 years fs.createWriteStream with path that includes dots(.) failed in asar
  • over 4 years Support for piped data
  • almost 5 years Changes to the asar format? Magic number, checksum, filesize
  • about 5 years Also put size of archive at the end of file
  • about 5 years Support file exclusion/inclusion
asar open pull requests
  • UnpackDir=xyz/{123,abc} pattern didn't match UnpackDir=xyz/123 and UnpackDir=xyz/abc combined
  • fix: escape src dir while crawling fs
  • Making asar compatible with electron + adding functions to remove cache
  • Do not return before write stream fully closes
asar questions on Stack Overflow
  • How to serve static files from an asar archive
  • Unable extract asar file
  • NSIS Failed opening file (.ASAR)
  • Packaging app With Electron and Asar
  • How can I hide the source code of a nodeJS solution built with Electron (asar files)?
  • Pass a packed JS file within an asar package to a spawned node child process
  • Where Can I Find Envisat Asar Dataset or SAR Geotiff Dataset for Oil Spill Detection?