Control validation of TSA and OCSP certificates in TElX509CertificateValidator
Implement an option to control validation of the TSA and OCSP certificates.
Example: a XAdES-T signature. During validation, the process validates the signing, OCSP, and TSA certificates and the chains of all of them. Chain validation can be skipped, but there is no option to skip validation of the OCSP and TSA certificates themselves.
Add support for the RSA-OAEP encryption scheme using non-exportable certificates.
The problem is that the data is encrypted using the RSA-OAEP encryption scheme, but OAEP padding for non-exportable certificates is not supported at the moment. So the component tries to emulate OAEP padding by extracting the private key, and this fails. Note: only the latest Windows versions support OAEP padding in Cryptographic Service Providers.
Add compression to the S/MIME class as per RFC 5751
S/MIME is currently defined by RFC 5751 (S/MIME v3.2). One notable feature is the CompressedData content type. The sender can compress the message and attachments at message compose time, then encrypt the result. Once a message is encrypted, it is essentially impossible to compress it, as compression depends on patterns that encryption destroys. Upon receipt, the message is decrypted and the compressed content is then expanded.
The intent is to reduce the size of mail messages, and hence the time required for transmission.
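The compress-before-encrypt ordering can be shown with a stdlib-only toy sketch (zlib stands in for the CompressedData step, and a random-keystream XOR stands in for real CMS encryption; both are illustrative stand-ins, not the S/MIME algorithms):

```python
import os
import zlib

message = b"S/MIME body with repetitive content. " * 100

def toy_encrypt(data: bytes) -> bytes:
    # XOR with a random keystream: a stand-in for real encryption,
    # enough to show that ciphertext looks like random noise.
    key = os.urandom(len(data))
    return bytes(a ^ b for a, b in zip(data, key))

# Compress first, then encrypt: compression sees the plaintext patterns.
ct_of_compressed = toy_encrypt(zlib.compress(message))

# Encrypt first, then try to compress: the patterns are gone.
compressed_ct = zlib.compress(toy_encrypt(message))

print(len(message), len(ct_of_compressed), len(compressed_ct))
# compress-then-encrypt shrinks the message; compressing ciphertext does not
```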
Support TPM Key Attestation during certificate generation
Key Attestation is a Trusted Platform Module feature that enables the TPM to confirm that the private key is stored within it and is not usable outside the TPM. This is used to ensure that only one PC holds a given private key (ensuring a unique identity).
It would be great if SecureBlackbox supported certificate request generation that creates the key inside the TPM and invokes the Key Attestation feature to attest to this in the certificate signing request.
Here is a potentially useful link showcasing some other TPM features that developers are interested in using but that existing security software implementations fail to provide: https://stackoverflow.com/questions/28862767/how-to-encrypt-bytes-using-the-tpm
This could be a good opportunity for SecureBlackbox to provide some exclusive features.
I am currently writing generic code that syncs between different file storage systems. The storage systems are implemented as interfaces so that I can sync between any two storage systems, for example between Dropbox and the local file system, or between two FTP sites.
Now to EldoS Cloud Storage. You have already created an abstract base to generalize between storage systems. I think it would be a nice addition to provide a sync function in the same way I described above.
In a broader view, however, your cloud storage does not implement everything. I think a good approach would be to use a simple interface for syncing rather than inheritance; this would make it easier to add other user-defined storage systems.
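The interface-over-inheritance idea above can be sketched in a few lines (all names here, `FileStorage`, `MemoryStorage`, `sync`, are hypothetical illustrations, not CloudBlackbox API):

```python
from typing import Dict, Protocol

class FileStorage(Protocol):
    """Hypothetical minimal storage interface; any backend (local FS,
    FTP, Dropbox, ...) only needs to implement these three methods."""
    def list_files(self) -> Dict[str, bytes]: ...
    def read(self, name: str) -> bytes: ...
    def write(self, name: str, data: bytes) -> None: ...

class MemoryStorage:
    """In-memory backend used here as a stand-in for a real one."""
    def __init__(self) -> None:
        self.files: Dict[str, bytes] = {}
    def list_files(self) -> Dict[str, bytes]:
        return dict(self.files)
    def read(self, name: str) -> bytes:
        return self.files[name]
    def write(self, name: str, data: bytes) -> None:
        self.files[name] = data

def sync(src: FileStorage, dst: FileStorage) -> int:
    """One-way sync: copy files missing or different on dst; return count."""
    copied = 0
    dst_files = dst.list_files()
    for name, data in src.list_files().items():
        if dst_files.get(name) != data:
            dst.write(name, src.read(name))
            copied += 1
    return copied

a, b = MemoryStorage(), MemoryStorage()
a.write("report.xlsx", b"v1")
a.write("log.txt", b"hello")
b.write("log.txt", b"hello")
print(sync(a, b))  # copies only report.xlsx
```

Because `sync` only depends on the small protocol, user-defined backends plug in without touching any inheritance hierarchy.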
Add FTP and SFTP to CloudBlackBox in order to have a unified method of access for all file systems.
The use case is B2B file transmissions. Many businesses are small companies that build data files in Excel manually, and many prefer Dropbox over FTP. Including FTP in CloudBlackBox would allow rewriting server code so that it supports current FTP users as well as cloud storage.
Currently objects are written to Azure in page mode, which gives a slightly larger size and a format incompatible with other software. Block mode would let other software use the original data written with CloudBlackbox.
Object expiration of S3 bucket via LifeCycle property access
Add a Set/Get to TElAWSS3DataStorageBucket to manipulate a bucket's lifecycle configuration properties. In particular, I'm after the ability to set the number of days after which objects auto-delete, in order to provide automatic maintenance of particular bucket objects (like log files).
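For reference, the underlying S3 REST operation is a PUT of the bucket's `?lifecycle` subresource with an XML body along these lines (the rule ID, prefix and day count are illustrative values only):

```xml
<LifecycleConfiguration>
  <Rule>
    <ID>expire-logs</ID>
    <Prefix>logs/</Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>30</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```

A Set/Get pair on the bucket object would essentially read and write this document.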
HTTP/2 is a new protocol with built-in fallback to HTTP/1.1 that aims to speed up browsers, which usually request several files from one server. HTTP/2 is not strictly needed by regular applications, as HTTP/2 servers must also support HTTP/1.1 requests.
Implement some AEAD mode (EAX for instance) in encryption
When dealing with low-level crypto, picking an appropriate encryption mode of operation for the task becomes important. SBB currently implements several modes: ECB (which is a real risk), CBC, CTR, CFB8, GCM and CCM.
Unfortunately, none of these modes is exposed as authenticated encryption with associated data (AEAD), which leaves the application responsible for authenticating data separately (typically by supplying an IV manually and then storing the result of an HMAC directly in the message). This results in more code, less compatibility and more complexity (and can lead to bigger messages as well).
Implementing at least one modern AEAD block cipher mode would remove the need to write that code.
My preferred mode for this would be EAX, since it has many desirable properties and isn't encumbered by any patent, but other modes could be considered as well in order to improve compatibility (see http://csrc.nist.gov/groups/ST/toolkit/BCM/modes_development.html#01 for a list and details of currently considered AEAD modes).
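The manual plumbing that an AEAD mode would eliminate looks roughly like this stdlib-only sketch (a SHA-256 counter-mode keystream stands in for the block cipher, since the point is the separate IV/HMAC bookkeeping, not the cipher itself):

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, iv: bytes, length: int) -> bytes:
    # Toy keystream (stand-in for AES-CTR): SHA-256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes,
                     plaintext: bytes, aad: bytes) -> bytes:
    # All of this bookkeeping is what a single AEAD call would replace.
    iv = os.urandom(16)                       # 1. pick an IV manually
    ct = bytes(p ^ k for p, k in
               zip(plaintext, _keystream(enc_key, iv, len(plaintext))))
    tag = hmac.new(mac_key, iv + aad + ct, hashlib.sha256).digest()
    return iv + ct + tag                      # 2. carry IV and tag yourself

def decrypt_and_verify(enc_key: bytes, mac_key: bytes,
                       blob: bytes, aad: bytes) -> bytes:
    iv, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(mac_key, iv + aad + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # 3. constant-time compare
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in
                 zip(ct, _keystream(enc_key, iv, len(ct))))

ek, mk = os.urandom(32), os.urandom(32)
blob = encrypt_then_mac(ek, mk, b"secret payload", b"header")
print(decrypt_and_verify(ek, mk, blob, b"header"))  # b'secret payload'
```

With an AEAD mode such as EAX or GCM, the two functions collapse to single seal/open calls and the 48 bytes of IV-plus-tag framing are defined by the mode itself.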
Implement partial object uploads in the cloud object storage classes (Azure, S3). The Azure API, for example, allows uploading 512-byte-aligned blocks (pages); if only 1 KB of a 100 MB file stored in the cloud is changed, it would be helpful to upload just that 1 KB instead of having to re-upload the entire file.
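The alignment math such an upload would need is simple (a sketch assuming Azure's 512-byte page granularity; the function name is illustrative):

```python
PAGE = 512  # Azure page blobs operate on 512-byte-aligned pages

def page_range(offset: int, length: int):
    """Round a changed byte range out to page boundaries, giving the
    (start, end) byte range to upload instead of the whole blob."""
    start = (offset // PAGE) * PAGE
    end = -(-(offset + length) // PAGE) * PAGE  # ceil to page boundary
    return start, end

# Changing 1 KB at offset 1000 of a 100 MB blob touches only 3 pages:
print(page_range(1000, 1024))  # (512, 2048)
```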
Support asynchronous methods in communication API.
There should be asynchronous versions of communication methods such as connect/send/receive, so that we don't have to spawn a new thread just to wait for a reply from the server when this can be done more efficiently via overlapped I/O etc. Preferably with cancellation support, in the same way as the various .NET 4.5 Async methods, e.g. System.Net.Sockets.Socket.ReceiveAsync.
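As an analogy for the requested API shape, here is the same pattern in Python's asyncio (a self-contained sketch: the echo server exists only so the example runs; `wait_for` plays the role of the cancellation token):

```python
import asyncio

async def handle(reader, writer):
    # Tiny echo server used only so the example is self-contained.
    data = await reader.read(100)
    writer.write(data)
    await writer.drain()
    writer.close()

async def main() -> bytes:
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    # Async connect/send/receive: no thread is blocked while waiting,
    # and the timeout cancels the pending receive if the server stalls.
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"ping")
    await writer.drain()
    reply = await asyncio.wait_for(reader.read(100), timeout=5.0)

    writer.close()
    server.close()
    await server.wait_closed()
    return reply

print(asyncio.run(main()))  # b'ping'
```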
Shamir's Secret Sharing is a cryptographic algorithm. It is a form of secret sharing in which a secret is divided into parts, each participant receiving its own unique part, and some or all of the parts are needed to reconstruct the secret.
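The scheme fits in a short sketch over a prime field (threshold k of n; the Mersenne prime and function names here are illustrative choices — production implementations typically share secrets byte-wise or over other fields):

```python
import random

P = 2**127 - 1  # a Mersenne prime; the polynomial lives in GF(P)

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # den^(P-2) mod P is the modular inverse of den (Fermat)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, n=5, k=3)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares suffice
```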
Card Verifiable Certificates (CVC) are digital certificates designed to be processed by devices with limited computing power, such as smart cards. This is achieved by using simple TLV (Tag-Length-Value) encoding with fixed fields. Fixed fields mean that each field in the certificate has a fixed (or maximum) length and the fields come in a well-defined order. This makes parsing easy, in contrast to ASN.1 parsing, which requires more processing and has to keep fields in memory while parsing nested content.
CVC is used by third-generation ePassports implementing Extended Access Control (EAC).
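The simplicity of flat TLV parsing can be illustrated in a few lines (a simplified sketch with single-byte tags and lengths; real CV certificates use BER-style multi-byte tags such as 0x7F21, so this is not a CVC parser):

```python
def parse_tlv(data: bytes):
    """Parse a flat sequence of simplified TLV records:
    1-byte tag, 1-byte length, then `length` value bytes."""
    records, i = [], 0
    while i < len(data):
        tag, length = data[i], data[i + 1]
        records.append((tag, data[i + 2:i + 2 + length]))
        i += 2 + length
    return records

# Two records: tag 0x5F carrying "CAR", tag 0x42 carrying "holder"
blob = bytes([0x5F, 3]) + b"CAR" + bytes([0x42, 6]) + b"holder"
print(parse_tlv(blob))  # [(95, b'CAR'), (66, b'holder')]
```

Because the fields arrive in a fixed order, a card can consume such records one at a time with no recursion and almost no buffering.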
The V8 script engine and Node.js are powerful and can potentially be used for security operations, so use of SecureBlackbox in those environments probably makes sense.
The specifications are available at http://www.etsi.org/deliver/etsi_ts/102200_102299/102231/03.01.02_60/ts_102231v030102p.pdf and http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2009:299:0018:0054:EN:PDF
Compression of responses is great, but in our environment we often need to POST or PUT large amounts of data to the server. Currently we're looking into compressing this data ourselves, because although it's unorthodox, we're only talking to our own server and we know we can handle it. It would be nice if SecureBlackBox could do this for us.
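Client-side, the request-body compression described above is only a few stdlib lines (a sketch; the URL is a placeholder and the request is built but not sent, since honoring Content-Encoding on requests is a private convention between client and server, as the request notes):

```python
import gzip
import urllib.request

body = b'{"measurement": 1.0}\n' * 5000        # large, repetitive payload
compressed = gzip.compress(body)

# Build a PUT whose body is gzip-compressed; the receiving server must
# be one we control, since request-body compression is non-standard.
req = urllib.request.Request(
    "https://example.invalid/upload", data=compressed, method="PUT",
    headers={"Content-Encoding": "gzip",
             "Content-Type": "application/json"})

print(len(body), len(compressed))  # the saving on the wire
```

Having the HTTP component do this transparently would remove exactly this boilerplate from application code.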
The PKCS#11 interface for smart cards contains enough functions if a smart card is used on a single PC. Problems usually arise with smart cards used on more than one PC, because the user cannot install PKCS#11 drivers on another PC where only user mode is available and administrator rights are not granted to basic users such as office workers or Internet cafe visitors.

The PKCS#15 profile defined in EU Norm EN 14890 (CWA 14890), "Application Interface for smart cards used as Secure Signature Creation Devices - Part 1: Basic services", is a perfect way for an application to communicate uniquely with any smart card that implements APDUs according to this PKCS#15 profile. It will be used in national eID cards, eHealth cards and corporate multipurpose cards. http://www.cen.eu/cen/Sectors/Sectors/ISSS/CEN%20Workshop%20Agreements/Pages/Electronic%20Signatures.aspx

PKCS#15 can also contain a secure store of trusted root certificates, which applications can use to simplify PKI for basic users: in the verification process, the holder of a smart card is usually not an IT expert, and the application can automatically decide which root is trusted for the card holder without any problems. At present, users of systems where such a secure root certificate store in PKCS#15 is not implemented are puzzled by very strange questions like: is this ... (root certificate) trusted?