Programming tip

Node.js: execute a system command synchronously

itbloger 2020. 6. 2. 08:27



I need a Node.js function like this:

result = execSync('node -v');

that synchronously executes the given command line and returns all the text that command writes to stdout.

PS. Yes, I know sync is wrong. This is for personal use only.

Update

We now have mgutz's solution, which gives us the exit code but not stdout! Still waiting for a more precise answer.

Update

mgutz has updated his answer, and the solution is there :)
Also, as dgo.a mentioned, there is the standalone module exec-sync.

Update 2014-07-30

The ShellJS lib has arrived. Consider it the best choice for now.


Update 2015-02-10

At last! Node.js 0.12 supports execSync natively.
See the official docs.


Node.js (since version 0.12, so for a while now) supports execSync:

child_process.execSync(command[, options])

Now you can do this directly:

const execSync = require('child_process').execSync;
const code = execSync('node -v');

and it will do what you expect (by default the child's I/O is piped back to the parent process). Note that you can also use spawnSync now.
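
For illustration, a minimal sketch of both calls (the commands and the utf8 encoding option are just examples; with an encoding set, both return strings rather than Buffers):

const { execSync, spawnSync } = require('child_process');

// execSync returns the command's stdout and throws on a non-zero exit code.
const version = execSync('node -v', { encoding: 'utf8' });
console.log(version.trim());

// spawnSync does not throw on a non-zero exit; it returns an object
// with status, stdout and stderr instead.
const result = spawnSync('node', ['-v'], { encoding: 'utf8' });
console.log(result.status, result.stdout.trim());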


See the execSync library.

This is fairly easy to do with node-ffi. I wouldn't recommend it for a server process, but for general development utilities it gets the job done. Install the library:

npm install node-ffi

Example script:

var FFI = require("node-ffi");
// Bind the C library's system() call: int system(const char* command);
var libc = new FFI.Library(null, {
  "system": ["int32", ["string"]]
});

var run = libc.system;
run("echo $USER");

[EDIT June 2012: how to get STDOUT]

var lib = ffi.Library(null, {
    // FILE* popen(char* cmd, char* mode);
    popen: ['pointer', ['string', 'string']],

    // void pclose(FILE* fp);
    pclose: ['void', [ 'pointer']],

    // char* fgets(char* buff, int size, FILE* fp);
    fgets: ['string', ['string', 'int','pointer']]
});

function execSync(cmd) {
  var
    buffer = new Buffer(1024),
    result = "",
    fp = lib.popen(cmd, 'r');

  if (!fp) throw new Error('execSync error: '+cmd);

  while(lib.fgets(buffer, 1024, fp)) {
    result += buffer.readCString();
  };
  lib.pclose(fp);

  return result;
}

console.log(execSync('echo $HOME'));

Use the ShellJS module.

Call the exec function without providing a callback.

Example:

var version = exec('node -v').output;
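
For reference, a slightly fuller sketch; note the assumption here that newer ShellJS releases expose the text as result.stdout rather than result.output, and that silent: true keeps the command from echoing to the console:

var shell = require('shelljs');

// Runs synchronously; silent: true suppresses printing the output.
var result = shell.exec('node -v', { silent: true });
console.log(result.code);                              // exit code
console.log((result.stdout || result.output).trim());  // stdout across versions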

There's an excellent module for flow control in node.js called asyncblock. If wrapping the code in a function is OK for your case, the following sample may be considered:

var asyncblock = require('asyncblock');
var exec = require('child_process').exec;

asyncblock(function (flow) {
    exec('node -v', flow.add());
    var result = flow.wait();
    console.log(result);    // There'll be trailing \n in the output

    // Some other jobs
    console.log('More results like if it were sync...');
});

This is not possible in Node.js; both child_process.spawn and child_process.exec were built from the ground up to be async.

For details see: https://github.com/ry/node/blob/master/lib/child_process.js

If you really want this to block, put everything that needs to happen afterwards in a callback, or build your own queue to handle it in a blocking fashion; I suppose you could use Async.js for this task.

Or, in case you have way too much time to spend, hack around in Node.js itself.
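
For instance, a minimal sketch of the Async.js approach mentioned above, running commands one after another and collecting their stdout (the command list is only illustrative):

var async = require('async');
var exec = require('child_process').exec;

// async.series runs the tasks in order; each task forwards its stdout
// to the final callback, which receives them as an array.
async.series([
  function (cb) { exec('node -v', function (err, stdout) { cb(err, stdout); }); },
  function (cb) { exec('npm -v', function (err, stdout) { cb(err, stdout); }); }
], function (err, results) {
  if (err) throw err;
  console.log(results.map(function (s) { return s.trim(); }));
});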


This is the easiest way I found:

exec-Sync: https://github.com/jeremyfa/node-exec-sync
(Not to be confused with execSync.)
Execute shell commands synchronously. Use it for migration scripts or CLI programs, but not for regular server code.

Example:

var execSync = require('exec-sync');   
var user = execSync('echo $USER');
console.log(user);

Just to add that even though there are few use cases where you should use them, spawnSync / execFileSync / execSync were added to Node.js in these commits: https://github.com/joyent/node/compare/d58c206862dc...e8df2676748e
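
If you do reach for one of them, a minimal execFileSync sketch might look like this (the command and arguments are only illustrative; with an encoding option it returns a string, and it throws if the process exits with a non-zero status):

const { execFileSync } = require('child_process');

// Runs the executable directly, without a shell, and returns its stdout.
const out = execFileSync('node', ['-v'], { encoding: 'utf8' });
console.log(out.trim());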


You can achieve this using fibers. For example, using my Common Node library, the code would look like this:

result = require('subprocess').command('node -v');

I've gotten used to implementing "synchronous" stuff at the end of the callback function. It's not very nice, but it works. If you need to run a sequence of command-line executions, wrap exec in a named function and call it recursively. This pattern seems usable to me:

var child_proc = require('child_process');

SeqOfExec(someParam);

function SeqOfExec(someParam) {
    // some stuff
    // .....
    // .....

    var execStr = "yourExecString";
    child_proc.exec(execStr, function (error, stdout, stderr) {
        if (error != null) {
            if (stdout) {
                throw Error("Something went wrong: " + error);
            } else {
                // consider that empty stdout causes
                // creation of error object
            }
        }
        // some stuff
        // .....
        // .....

        // you also need some flag which will signal that you 
        // need to end loop
        if (someFlag ) {
            // your synch stuff after all execs
            // here
            // .....
        } else {
            SeqOfExec(someAnotherParam);
        }
    });
};

I had a similar problem and I ended up writing a node extension for this. You can check out the git repository. It's open source and free and all that good stuff!

https://github.com/aponxi/npm-execxi

ExecXI is a node extension written in C++ that executes shell commands one by one, streaming each command's output to the console in real time. Both chained and unchained modes are available, meaning you can choose to stop the script after a command fails (chained), or continue as if nothing had happened!

Usage instructions are in the ReadMe file. Feel free to make pull requests or submit issues!

EDIT: At first it didn't return stdout, it only streamed it in real time, but it does now; I just released that today. Maybe we can build on it.

Anyway, I thought it was worth to mention it.


You can do synchronous shell operations in Node.js like so:

var execSync = function(cmd) {

    var exec  = require('child_process').exec;
    var fs = require('fs');

    //for linux use ; instead of &&
    //execute your command followed by a simple echo 
    //to file to indicate process is finished
    exec(cmd + " > c:\\stdout.txt && echo done > c:\\sync.txt");

    while (true) {
        //consider a timeout option to prevent infinite loop
        //NOTE: this will max out your cpu too!
        try {
            var status = fs.readFileSync('c:\\sync.txt', 'utf8');

            if (status.trim() == "done") {
                var res = fs.readFileSync("c:\\stdout.txt", 'utf8');
                fs.unlinkSync("c:\\stdout.txt"); //cleanup temp files
                fs.unlinkSync("c:\\sync.txt");
                return res;
            }
        } catch(e) { } //readFileSync will fail until file exists
    }

};

//won't return anything, but will take 10 seconds to run
console.log(execSync("sleep 10")); 

//assuming there are a lot of files and subdirectories, 
//this too may take a while, use your own applicable file path
console.log(execSync("dir /s c:\\usr\\docs\\"));

EDIT: this example is meant for Windows environments; adjust for your own Linux needs if necessary.


I actually had a situation where I needed to run multiple commands one after another from a package.json preinstall script in a way that would work on both Windows and Linux/OSX, so I couldn't rely on a non-core module.

So this is what I came up with:

#cmds.coffee
childproc = require 'child_process'

exports.exec = (cmds) ->
  next = ->
    if cmds.length > 0
      cmd = cmds.shift()
      console.log "Running command: #{cmd}"
      childproc.exec cmd, (err, stdout, stderr) ->
        if err? then console.log err
        if stdout? then console.log stdout
        if stderr? then console.log stderr
        next()
    else
      console.log "Done executing commands."

  console.log "Running the following commands:"
  console.log cmds
  next()

You can use it like this:

require('./cmds').exec ['grunt coffee', 'nodeunit test/tls-config.js']

EDIT: as pointed out, this doesn't actually return the output or allow you to use the result of the commands in a Node program. One other idea for that is to use LiveScript backcalls. http://livescript.net/

Reference URL: https://stackoverflow.com/questions/4443597/node-js-execute-system-command-synchronously
