Below are the common error codes for most of the third-party programs on the integrations list.
The errors are listed in three categories:
- Class B
- Class C
- Storage Caps
If you do not see the error you are getting below, please contact support.
Class B Errors:
B2_Fuse:
2016-05-19 08:44:31,204:DEBUG:Open /test.txt (flags:163840)
2016-05-19 08:44:31,205:INFO:_get_file test.txt
Error returned from server:
Params: None
Headers: {'Authorization': '3_20160519157777_177777386747777772a08acb_8f4b7777777257d6772d53521616af07777777eff3_777_acct'}
{
"code": "download_cap_exceeded",
"message": "Cannot download file, download bandwidth or transaction (Class B) cap exceeded. See the Caps & Alerts page to increase your cap.",
"status": 403
}
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/fuse.py", line 495, in _wrapper
return func(*args, **kwargs) or 0
File "/usr/lib/python2.7/site-packages/fuse.py", line 572, in open
fi.flags)
File "/usr/lib/python2.7/site-packages/fuse.py", line 800, in __call__
return getattr(self, op)(*args)
File "b2fuse.py", line 395, in open
self.open_files[path] = array.array('c', self.bucket.get_file(path))
File "/home/k77/b2_fuse/b2bucket_cached.py", line 155, in get_file
return self._get_file(*args)
File "/home/k77/b2_fuse/b2bucket.py", line 181, in _get_file
with OpenUrl(api_url, None, encoded_headers) as resp:
File "/home/k77/b2_fuse/b2_python_pusher.py", line 71, in __enter__
sys.exit(1)
sys.exit(1)
SystemExit: 1
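The server responses quoted in this article all share the same JSON shape (`code`, `message`, `status`). If you are calling the B2 API directly, a small helper can tell cap errors apart from other 403 responses. This is a minimal sketch, not an official API: `is_cap_error` is a hypothetical name, and the set of codes is copied from the responses shown in this article.

```python
import json

# Cap-related "code" values, copied from the server responses quoted in this
# article. "transaction_cap" is the spelling shown in the b2 python script
# output in the Class C section; the other two appear in the JSON bodies above.
CAP_ERROR_CODES = {
    "download_cap_exceeded",
    "transaction_cap",
    "storage_cap_exceeded",
}

def is_cap_error(status, body):
    """Return True when a 403 response body reports a usage-cap error."""
    if status != 403:
        return False
    try:
        error = json.loads(body)
    except ValueError:
        # Body was not JSON, so it is some other kind of 403.
        return False
    return error.get("code") in CAP_ERROR_CODES

# A body like the one in the B2_Fuse example above:
print(is_cap_error(403, '{"code": "download_cap_exceeded", "status": 403}'))  # True
```

When this returns True, raising your cap on the Caps & Alerts page is the fix, not retrying the request.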
HashBackup:
Error returned from server:
Params: {"bucketId": "Your_bucket_ID_Here"}
Headers: {'Authorization': u'3_20160519157777_177777386747777772a08acb_8f4b7777777257d6772d53521616af07777777eff3_777_acct'}
{
"code": "storage_cap_exceeded",
"message": "Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.",
"status": 403
}
Duplicity:
Attempt 1 failed. HTTPError: HTTP Error 403: Forbidden
Synology:
MSP360:

Cyberduck:

Class C Transaction Cap Errors:
B2 Python script:
ERROR: unable to authorize account: Unknown error: 403 transaction_cap Transaction cap exceeded, see the Caps & Alerts page to increase your cap
CloudBerry Backup:

Cyberduck:
Odrive:
The b2 server could not validate these credentials.
B2_Fuse

HashBackup:
Duplicity:
Dropshare:

Neofinder:

Synology:

Also, check the storage cap limit: Synology Cloud Sync will keep retrying failed uploads until your Class C transactions are used up, so if your cap is set to $0.01 or $0.02 you may never see this error.
Storage Cap Errors:
HashBackup:
dest b2: error #1 of 3 in send arc.0.2: b2(b2): http status 403 (Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.) getting upload url
dest b2: retry #1 in 5 seconds for send arc.0.2
dest b2: error #2 of 3 in send arc.0.2: b2(b2): http status 403 (Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.) getting upload url
dest b2: retry #2 in 10 seconds for send arc.0.2
dest b2: error #3 of 3 in send arc.0.2: b2(b2): http status 403 (Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.) getting upload url
dest b2: stopping destination because of errors
dest b2: Traceback (most recent call last):
File "/basedest.py", line 327, in loop
File "/basedest.py", line 412, in sendcmd
File "/basedest.py", line 68, in retry
File "/b2dest.py", line 484, in sendfile
File "/b2dest.py", line 174, in b2api
File "/b2dest.py", line 146, in _uploadauth
Exception: b2(b2): http status 403 (Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.) getting upload url
Duplicity:
duplicity full ~/Documents/ b2://d9fff999997b:001b99999992d3aff5edabfceeeae57b45eee4f74@Testing/duplicity
Local and Remote metadata are synchronized, no sync needed.
Last full backup date: none
GnuPG passphrase:
Retype passphrase to confirm:
Attempt 1 failed. HTTPError: HTTP Error 403: Forbidden
Attempt 2 failed. HTTPError: HTTP Error 403: Forbidden
Attempt 3 failed. HTTPError: HTTP Error 403: Forbidden
Attempt 4 failed. HTTPError: HTTP Error 403: Forbidden
B2_Fuse:
Error returned from server:
Params: {"bucketId": "1dc92b5dec998feeef020a1b"}
Headers: {'Authorization': u'3_20160519153250_f75bedbc7f59eeeee3444972_419eb852ac6eeee38ea8e7159d399eee98c66ee2_001_acct'}
{
"code": "storage_cap_exceeded",
"message": "Cannot upload files, storage cap exceeded. See the Caps & Alerts page to increase your cap.",
"status": 403
}
Dropshare:

Cyberduck:

Synology:
Currently, Synology appears to keep retrying uploads when you are out of storage space, without reporting an error, and each retry consumes Class C transactions. So if you are out of Class C transactions, check your storage cap as well.
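Blind retry loops like this are why cap errors are worth special-casing: each retry is itself a billable transaction. The sketch below shows the idea, with a backoff schedule mirroring the HashBackup log above (5 s, then 10 s). The exception names are hypothetical stand-ins, not part of any real client library.

```python
import time

class CapExceeded(Exception):
    """Hypothetical stand-in for a 403 *_cap_exceeded response from B2."""

class TransientError(Exception):
    """Hypothetical stand-in for a retryable network or server error."""

def upload_with_retry(upload, max_attempts=3, base_delay=5):
    """Retry a flaky upload, but never retry a cap error."""
    for attempt in range(1, max_attempts + 1):
        try:
            return upload()
        except CapExceeded:
            # Retrying a cap error only spends more Class C transactions,
            # so surface it immediately instead of retrying.
            raise
        except TransientError:
            if attempt == max_attempts:
                raise
            # Linear backoff: 5 s after attempt 1, 10 s after attempt 2, ...
            time.sleep(base_delay * attempt)
```

A well-behaved client that works this way fails fast on a cap error and points the user at the Caps & Alerts page, rather than silently chipping away at the transaction allowance.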
MSP360:
