I'm sure you can replicate all the logic in the database, but it sounds like you'll have to build a complete framework for processing files. This is a lot of work!
The big challenge here is:
* The ability to read text and binary files, including multiple file formats within a single input file.
* Dynamic file formats, which are an example of the previous requirement.
External tables are great when you know the input file structure. If you're working with a relatively small number of known record structures, it may be feasible to write routines to process each record type. I discounted the use of SQL inside PL/SQL because of the number of context switches this would incur.
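To illustrate the "known structure" case, here's a minimal external table sketch. The directory object `data_dir`, the file name, and the column layout are all assumptions for illustration:

```
-- Sketch: external table over a known, fixed-format CSV file.
-- data_dir, orders.csv, and the columns are hypothetical.
CREATE TABLE orders_ext (
  order_id    NUMBER,
  customer_id NUMBER,
  order_date  DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    ( order_id,
      customer_id,
      order_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD" )
  )
  LOCATION ('orders.csv')
)
REJECT LIMIT UNLIMITED;
```

You'd need one such definition (or an `ALTER TABLE ... LOCATION` switch) per known record structure, which is why this only scales to a small number of formats.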
You're on 19.6, which means you have access to (table) SQL macros! I'm not sure these will help, but they're worth looking into: https://blogs.oracle.com/datawarehousing/sql-macros-have-arrived-in-autonomous-database
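For a flavour of what a table SQL macro looks like, here's a hedged sketch. The function name, the `orders` table, and the `created_date` column are assumptions; the key point is that the macro text is spliced into the query at parse time, so there's no per-row PL/SQL call:

```
-- Sketch of a table SQL macro (19.6+); names are hypothetical.
CREATE OR REPLACE FUNCTION recent_rows (
  t    DBMS_TF.TABLE_T,
  days NUMBER
) RETURN VARCHAR2 SQL_MACRO AS
BEGIN
  RETURN q'[
    SELECT *
    FROM   t
    WHERE  created_date > SYSDATE - days
  ]';
END recent_rows;
/

-- Usage: expanded at parse time into the enclosing SQL statement.
SELECT * FROM recent_rows(orders, 7);
```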
You could also check out polymorphic table functions. These allow you to create routines such as dynamic CSV-to-columns converters: https://livesql.oracle.com/apex/livesql/file/content_F99JG73Z169WENDTTQFDQ0J09.html
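As a rough idea of the shape of a polymorphic table function, here's a skeleton only; the package and function names are assumptions, and the Live SQL link above has a complete, working csv-to-columns implementation. The `DESCRIBE` function decides the output columns at parse time; `FETCH_ROWS` fills them in at run time:

```
-- Skeleton of a row-polymorphic table function (18c+).
-- csv_pkg / csv_to_columns are hypothetical names.
CREATE OR REPLACE PACKAGE csv_pkg AS
  -- Called at parse time: inspect the input table and
  -- declare the new output columns.
  FUNCTION describe (
    tab       IN OUT DBMS_TF.TABLE_T,
    col_names IN     DBMS_TF.COLUMNS_T
  ) RETURN DBMS_TF.DESCRIBE_T;

  -- Called at run time: split the CSV values and
  -- populate the declared columns.
  PROCEDURE fetch_rows;
END csv_pkg;
/

CREATE OR REPLACE FUNCTION csv_to_columns (
  tab       TABLE,
  col_names COLUMNS
) RETURN TABLE PIPELINED
  ROW POLYMORPHIC USING csv_pkg;
/
```

The appeal for your problem is that one PTF can serve many record layouts, because the column list is computed when the query is parsed rather than hard-coded per format.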
But as I say, this is (probably) only worth it if you're dealing with a few record types. If you're dealing with a large number of record types, and/or you have to process new record types with little (or no!) warning, you'll want to build a complete dynamic framework.
Remember that your time costs the company money too! In my experience, many companies avoid buying software because it's "too expensive" and build it themselves... only to spend significantly more in salaries for developers, testers, etc. to get the same result.