ACHEMA 2015: Process development - New challenges demand new processes

By Dechema

The route from the lab to the industrial process can be a long one - simulation tools and modern methodology help to shorten it

Despite their innate conservatism, the chemical, pharmaceutical, oil and gas industries are always looking for new products that are more effective than those they replace, or cheaper, more environment-friendly routes to existing products. Developing a new process can take up to a decade and requires co-ordination between professionals from several different cultures, often spread across the globe. New computer simulation techniques work alongside traditional scientific and engineering creativity to develop safe, economic and sustainable manufacturing processes.

Process development bridges the gap between laboratory test-tubes and commercial-scale plants. The substance being manufactured could be a bulk chemical shipped in thousands of tonnes, or a biopharmaceutical made on a kilogramme scale.

If the product is new to the market, the aim is to develop a process that is acceptable in terms of product quality; plant safety and environmental impact; capital and operating costs; and commercial risk, including time to market.

If the product is well-established, the developers will typically strive for a process that out-performs the established route. They may also wish to bypass intellectual property (IP) restrictions that tie a particular process to a competitor, assuming the product itself is not patented.

Process development is grounded firmly in science and engineering, but its broad scope, multi-disciplinary nature and creative content require a lot of management skills too.

Oil refiners and manufacturers of bulk chemicals face huge pressures on cost and reliability, so they need to get their process designs right first time. Drugs manufacturers have the extra constraint that pharmaceutical processes are very hard to change once they have been validated by the regulators. Between these size extremes, makers of fine chemicals need flexible plants to accommodate frequent changes of product and process.

Measuring their development cycles in years and their plant lifetimes in decades, the process industries are famously conservative when it comes to adopting new technologies. Process designs do evolve, however, driven by factors including:

  • improved unit operations (process intensification) and catalysts;

  • new design methods (computer simulation and equipment networks);

  • new methods of monitoring and control;

  • new constraints on raw materials and utilities such as energy and cooling water;

  • new pressures on safety and environmental performance, including the trend towards bio-based feedstocks;

  • new business and legislative pressures (e.g. the need to get to market faster).

Exhibitors at ACHEMA 2015 who influence process development cover a wide spectrum. They include technology licensors, engineering and construction (E&C) companies, and consultants; specialists in simulation and process control; equipment manufacturers; and chemicals producers.

Unit operations and improved chemistry

Unit operations such as chemical reactions, mixing, liquid-liquid extraction, filtration and drying form the heart of every traditional chemical process. Changes at this fundamental level tend to have the biggest effects on process development.

Batch reactors are flexible and aid product traceability, so they are traditionally favoured for fine chemicals and pharmaceuticals. Continuous reactors, on the other hand, are inherently more cost-efficient, easier to scale up, and often safer. Recent news of continuous processes for aromatic amines in the pharmaceutical industry, and intermediates used to make the herbicide glyphosate, show that the decision to use batch or continuous is still not always clear-cut.
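The batch-versus-continuous trade-off can be made concrete with the textbook comparison for first-order kinetics: a batch reactor run for time t reaches conversion 1 - exp(-kt), while an ideal continuous stirred-tank reactor (CSTR) with the same mean residence time reaches only kt/(1 + kt). The rate constant below is hypothetical - this is a sketch of the comparison, not a design calculation:

```python
import math

def batch_conversion(k: float, t: float) -> float:
    """Conversion of a first-order reaction after batch time t: X = 1 - exp(-k*t)."""
    return 1.0 - math.exp(-k * t)

def cstr_conversion(k: float, tau: float) -> float:
    """Conversion of a first-order reaction in an ideal CSTR with
    mean residence time tau: X = k*tau / (1 + k*tau)."""
    return k * tau / (1.0 + k * tau)

k = 0.05  # 1/min, hypothetical rate constant
for minutes in (10, 60, 240):
    print(f"t = {minutes:4d} min: batch X = {batch_conversion(k, minutes):.3f}, "
          f"CSTR X = {cstr_conversion(k, minutes):.3f}")
```

At every residence time the batch (or, equivalently, plug-flow) reactor out-converts the single CSTR, which is one reason continuous designs often use tubular reactors or CSTRs in series; the continuous route wins back its ground on throughput, heat management and ease of scale-up.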

Catalysts are a vital part of many industrial processes, and an area of rapid development thanks to improved computer simulation and automated screening technologies. A more selective catalyst can increase yields and eliminate the need for a downstream separation stage, while new catalysts can create new products. For instance, Jowat AG (Detmold, Germany) is using a process developed by Novomer Inc. (Waltham, MA, USA) to produce polypropylene carbonate (PPC) polyol, a “green” polymer that can be made from recovered carbon dioxide. The Novomer process depends on a proprietary cobalt-based catalyst.

“Process intensification” refers to techniques for reducing plant footprint or energy use by combining operations or increasing driving forces. An example is reactive distillation (see below), which can reduce plant complexity by performing chemical reaction and separation in the same column.

Machines that combine mixing, thermal processing and evaporation of viscous products are a speciality of the German-speaking countries, and several manufacturers will be displaying this technology at ACHEMA. By replacing multiple separate items of equipment and reducing or eliminating the need for solvents, these machines can yield better product quality from simpler, more economical processes.

Going a step further, “reagentless synthesis” seeks to eliminate not just solvents but also reactive chemicals from the mix. Instead, reactions are driven by electricity, light, or ultrasound.

In complex plants, re-using waste heat is key to good energy efficiency, while the same is true of water re-use in regard to environmental performance, especially in regions where water is scarce. However, it is often not obvious how best to do this. The design methodology known as “pinch” can greatly improve re-use rates by matching requirements for heat and water to potential sources.
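To illustrate how pinch-style targeting works, the sketch below implements the standard "problem table" algorithm: stream temperatures are shifted by half the minimum approach temperature, net heat deficits are computed interval by interval, and cascading them down yields the minimum hot- and cold-utility duties. The stream data are made up, and this is a bare-bones illustration of the published method, not any vendor's implementation:

```python
def pinch_targets(streams, dt_min=10.0):
    """Problem-table algorithm: minimum hot- and cold-utility targets.

    streams: list of (t_supply, t_target, cp) tuples; a stream is hot
    if t_supply > t_target. cp is the heat-capacity flowrate (kW/K).
    """
    shift = dt_min / 2.0
    shifted = []
    for t_supply, t_target, cp in streams:
        if t_supply > t_target:                   # hot: shift down, releases heat
            shifted.append((t_target - shift, t_supply - shift, -cp))
        else:                                     # cold: shift up, absorbs heat
            shifted.append((t_supply + shift, t_target + shift, cp))
    bounds = sorted({t for lo, hi, _ in shifted for t in (lo, hi)}, reverse=True)
    # Net heat deficit in each shifted temperature interval, top to bottom.
    deficits = [
        sum(cp for lo, hi, cp in shifted if lo <= b_lo and hi >= b_hi) * (b_hi - b_lo)
        for b_hi, b_lo in zip(bounds, bounds[1:])
    ]
    # Cascade the deficits: hot utility must cover the worst cumulative deficit.
    cum, q_hot = 0.0, 0.0
    for d in deficits:
        cum += d
        q_hot = max(q_hot, cum)
    q_cold = q_hot - cum                          # closes the overall energy balance
    return q_hot, q_cold

# Hypothetical streams: one hot (150 -> 50 C, 2 kW/K), one cold (30 -> 130 C, 2 kW/K).
print(pinch_targets([(150, 50, 2.0), (30, 130, 2.0)], dt_min=10.0))  # (0.0, 0.0)
```

For this pair, counter-current exchange runs at a constant 20-degree approach, so all 200 kW can be recovered and both utility targets are zero; stretch the cold stream's target to 150 C and the algorithm reports the 40 kW of hot utility the extra duty requires.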

Characterisation, simulation, measurement and control

Scale-up is the central engineering problem in developing a new process. A simple example concerns a heat-producing reaction such as a nitration: going from a 100 g laboratory batch to a 1 tonne factory vessel of the same proportions means 10,000 times as much heat to remove, but less than 500 times the available surface area. Heat transfer therefore needs to be more than 20 times as effective as in the lab.
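The arithmetic behind that example is pure geometry: for geometrically similar vessels, volume scales with the cube of the linear dimension and surface area with the square, so the required heat flux grows with the cube root of the volume ratio. A quick check:

```python
# Geometric scale-up: volume grows as L^3, surface area as L^2, so the
# heat-removal duty per unit area grows as the cube root of the volume
# ratio. Assumes geometric similarity and equal density at both scales.
volume_ratio = 1_000_000 / 100               # 1 tonne vessel vs 100 g lab batch
area_ratio = volume_ratio ** (2 / 3)         # surface area scales as V^(2/3)
heat_flux_ratio = volume_ratio / area_ratio  # equals V^(1/3)

print(f"heat to remove: {volume_ratio:,.0f}x")        # 10,000x
print(f"available surface area: {area_ratio:,.0f}x")  # ~464x
print(f"required heat flux: {heat_flux_ratio:.1f}x")  # ~21.5x
```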

The starting point for scale-up is a thorough understanding of the physical and chemical properties of the raw materials, the final products, and any intermediates and byproducts. In this area, reaction calorimetry and measurements of the physical properties of powders are examples of characterisation techniques that have seen recent progress.

However, it will never be possible to measure every relevant property in the lab. Data gathered from pilot plants (see below) and existing production plants is therefore often at a premium, and new technology can help here too. Modern process control systems and wireless networks can collect process data from even hard-to-reach locations. Techniques such as tomography, acoustic analysis, and online spectrometers and mass spectrometers (known as “process analytical technology” (PAT) in the pharmaceutical industry) help to reveal what is really going on inside steel pipes and vessels.

Of course, engineers also need a detailed understanding of the full-scale process equipment they propose to use. Mathematical modelling is increasingly helpful here, in the form of traditional process simulation (both static and dynamic), computational fluid dynamics (CFD), and the newer “multi-scale” or “multi-physics” modelling techniques.
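To give a flavour of what dynamic simulation means at its very simplest, the sketch below integrates the transient mass balance of an ideal stirred tank with a first-order reaction using explicit Euler steps. All numbers are made up, and commercial tools handle far larger, stiffer equation systems with implicit, adaptive solvers - this only illustrates the idea of marching a model through time:

```python
# Minimal dynamic simulation: an ideal CSTR with first-order reaction
# A -> B, integrated by explicit Euler. Hypothetical parameters:
# feed concentration c_in, residence time tau, rate constant k.
def simulate_cstr(c_in=1.0, tau=10.0, k=0.2, c0=0.0, dt=0.01, t_end=100.0):
    """Integrate dC/dt = (c_in - C)/tau - k*C and return the final concentration."""
    c, t = c0, 0.0
    while t < t_end:
        c += dt * ((c_in - c) / tau - k * c)
        t += dt
    return c

# Analytical steady state: C* = c_in / (1 + k*tau) = 1 / (1 + 2) = 0.333...
print(f"C after 100 time units: {simulate_cstr():.3f}")
```

After ten residence times the simulated concentration has settled to the analytical steady state, which is the kind of consistency check a modeller would run before trusting the dynamics of a larger flowsheet.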

Improved modelling also translates to better process control. At the 2014 Process Development Symposium of the American Institute of Chemical Engineers (AIChE), Professor Juergen Hahn, a native of Germany now at the Rensselaer Polytechnic Institute (Troy, NY, USA), described how modern control methods can turn reactive distillation from an academic curiosity into a practical technique. In refineries, Prof. Hahn said, reactive distillation is a promising way to remove benzene from reformate – but the close coupling between reaction and mass transfer creates complex dynamics that traditionally have been hard to understand and control. Dynamic simulation using the gPROMS modelling environment from Process Systems Enterprise (London, UK) makes this task much easier.

The continuing role of the pilot plant

No matter how carefully materials are characterised in the laboratory, new and sometimes hazardous behaviours will show up at larger scales. Their causes include contamination, corrosion, fouling, and differences in patterns of flow, mixing, and heat transfer.

Process development therefore traditionally includes a “pilot” stage between the lab and the full-scale plant. Pilot plant scales range from a few kilogrammes to a few tonnes, depending on the product. Though better modelling may allow the pilot stage to be scaled down or even skipped altogether, pilot plants will remain important for the foreseeable future.

Companies may use pilot plants for three basic reasons: to show that new processes are viable, to generate data for scale-up, and to produce enough product to interest potential customers. The latter is especially important in the pharmaceutical industry, while proof of concept and data generation are the main drivers in refineries and petrochemical plants.

Back in 2005, the AIChE announced the results of a pioneering study of how companies use pilot plants. The three-year study gathered information from 30 North American companies in the chemical, pharmaceutical, oil and gas industries. Some firms said they piloted all new processes. Others were more selective, assessing the degree of risk in scale-up through individual judgment, systematic reviews, or “stage gate” techniques in which research ideas have to pass a series of formal reviews. Unit operations such as liquid-liquid extraction and liquid mixing are still hard to model, the participants said, so pilot plant data is important for scale-up.

The drive for sustainability

Many of the design techniques discussed above improve the environmental performance of plants by reducing waste, cutting process complexity, or in the case of new catalysts by allowing processes to work at lower temperatures.

On top of this, the push to create a bio-based chemical industry is already driving the development of many new processes. As one example of many, Bayer MaterialScience AG (Leverkusen, Germany) recently announced that it will make pentamethylene di-isocyanate, a new cross-linking agent for coatings and adhesives, with 70 percent of the carbon content coming from biomass. Production of up to 20,000 t/y is scheduled for 2016. Likewise, making ethanol from cellulosic waste instead of food crops depends heavily on the development of new enzymes, and of new separation techniques, to break down woody materials.

The desire to reduce environmental impact and the need to operate in arid regions – worsened by the prospect of climate change – encourages the design of “zero liquid discharge” (ZLD) plants. Air-cooled heat exchangers and membrane-based wastewater purification processes are examples of the design techniques needed for ZLD plants.

Environmental aspects also feature in the growing use of structured software tools, which complement the "soft" management skills needed to integrate the contributions of chemists, biomedical researchers, chemical engineers and the other disparate cultures that join forces to develop new processes.

Shaibal Roy of DuPont in the US pointed out to the AIChE process development symposium that sustainability has now joined technical and economic feasibility as a key requirement for new processes. At ACHEMA 2015, all three aspects will be discussed; successful process industries cannot miss out on any one of them.