Hi everyone,
I have some scripts that upload important files to the cloud on the router's restart. I had uploaded a known_hosts file to ~/.ssh/ so that the automated process would get no trusted-host prompts.
But starting from 384.12 the uploads began failing.
Copying to ~/.ssh gives a "read-only file system" error when it's done from the startup scripts. Digging deeper, I found that even if I copy directly to /home/root/.ssh/, I still get the following error from scp in the startup script (importantly, it happens only when it's called from the scripts):
Code:
--- script.sh ---
scp -i id_rsa -P 2222 /tmp/log.log zperetz@upload.local:~
-----------------
/usr/bin/dbclient: Warning: failed creating //.ssh: Read-only file system
Host 'upload.local' is not in the trusted hosts file.
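The double slash in //.ssh makes me suspect that $HOME is empty (or /) in the startup-script environment, so ~ has nothing to expand to. A quick way to check would be to log $HOME from the startup script (the log path /tmp/home.log is just an example):

Code:
#!/bin/sh
# Log what HOME is when the startup script runs; from an
# interactive login it should be /home/root.
echo "HOME='$HOME'" >> /tmp/home.log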
It looks like Dropbear is also trying to write to ~/.ssh/known_hosts, with no luck. But if I repeat the same scp command from the CLI (after logging in), the problem disappears and known_hosts is accepted, which means ~ is resolved correctly there.
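If an empty $HOME is indeed the cause, a minimal workaround sketch (assuming /home/root really is root's home directory on this firmware) would be to export it at the top of the script:

Code:
#!/bin/sh
# HOME may be unset or "/" when startup scripts run, which would
# explain dbclient trying to create //.ssh; point it at the real
# home directory so known_hosts is found.
export HOME=/home/root

scp -i id_rsa -P 2222 /tmp/log.log zperetz@upload.local:~

Alternatively, dbclient has a -y option that accepts unknown host keys automatically, which would avoid the known_hosts lookup altogether, though I'm not sure Dropbear's scp passes it through.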
Any ideas for a solution would be very much appreciated.