Friday, November 17, 2023

S88 XML Recipe Round Trip via the Dynochem Mixing Web App

Just when you thought you knew why the coolest link on the Internet was the Dynochem Mixing Web App, something even cooler has come along, thanks to collaboration between Scale-up Systems and the wider Autochem business unit of Mettler Toledo.

Used correctly, iControl can export an XML file containing the detailed recipe and trends for your experiments in the EasyMax, OptiMax or RX-10.  That means the actual amounts charged and the timing, which may differ from the intended experiments set up in your ELN or elsewhere.

Using iControl 6.2, you can now export that recipe and drag it into the Mixing Web App.  The App then reads the entire procedure, breaks out the relevant operations, calculates material properties for each operation using the Scale-up Suite Materials system, reads equipment information from the Scale-up Equipment Data Service and presents the results for Vessels 1 and 2 in the browser.

You can select Vessel 2 from your equipment network and agitation conditions are then automatically scaled at constant power per unit mass.  You can adjust fill volumes and recalculate conditions.  You can save the results for later re-use, generate a PDF report and share the results with colleagues using the URL for the page.  
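
For the curious, here is a minimal sketch of the constant power per unit mass calculation behind that scaling, assuming the turbulent-regime relation P = Np x rho x N^3 x D^5 (density cancels in P/m for the same fluid); the vessel numbers below are illustrative assumptions, not values from the App.

    def speed_at_constant_power_per_mass(n1, d1, np1, v1, d2, np2, v2):
        """Vessel 2 impeller speed (1/s) matching Vessel 1 power per unit mass.

        Turbulent regime: P = Np*rho*N**3*D**5, so P/m = Np*N**3*D**5/V for the
        same fluid.  n: speed (1/s); d: impeller diameter (m); np: power
        number (-); v: fill volume (m3)."""
        eps = np1 * n1**3 * d1**5 / v1              # W/kg; density cancels
        return (eps * v2 / (np2 * d2**5)) ** (1.0 / 3.0)

    # Illustrative numbers only: 0.8 L lab charge at 500 rpm into a 4 m3 vessel.
    n2 = speed_at_constant_power_per_mass(n1=500 / 60, d1=0.045, np1=1.3,
                                          v1=0.0008, d2=0.80, np2=1.3, v2=4.0)
    print(f"Vessel 2 speed ~ {60 * n2:.0f} rpm")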

If your Vessel 2 selection is from the Mettler ecosystem, you can even export the scaled recipe and drag that into iC Data Center using the new integration with Scale-up Suite.  Your designed experiment will appear moments later in iControl, ready for execution.

That's scale-up of a complex lab procedure with just a few clicks, saving time and tedium compared to current workflows, now fast and easy enough to de-risk any project.

Take a look at the video here (free Scale-up account required) to see the "S88 XML Recipe Round Trip" in action.  To learn more, take the training exercise.

This has a pretty big impact in its own right.  Even better, it lays a foundation for other round trips and automations, including workflows that incorporate kinetic models and machine learning applications.  Watch this space!


Tuesday, June 21, 2022

New: Dynochem Equipment Data Service (EDS) puts equipment data at your fingertips

This month we delivered our new equipment data service (EDS) capability to more than 150 customer organizations globally.  Leading customers adopted the system shortly after release of Scale-up Suite 2 in July 2021; now we are formally going live for everyone.

This SQL-database-backed approach to managing your equipment data has many advantages over the old system, which required users to find and import our industry-standard Excel-based template, in use since 2011.  It is also the only supported way to retrieve your equipment data into the latest version of our mixing and heat transfer toolbox after 30 June 2022.

Features include secure user account based access control, easy access from any device and a change log for traceability.

We have made the administrators of your EDS the same people who administer your Dynochem license.  We have sent your admins (custodians of your database) simple instructions to populate the service with your equipment information and make it available to you.  For users: as this capability is rolled out, you will see the Vessel Update button become active in your Dynochem 6 ribbon.  Other benefits of adoption include:

  • Your continued ability to use the latest version of the mixing toolbox with your equipment data after 30 June.  The toolbox will no longer have an Excel file Import button, so the only way to include your organization's equipment in the toolbox will be using the EDS (Vessel Update button in the ribbon)
  • The latest version of the toolbox (30 June) will include a fuller range of Mettler Toledo lab vessels you can easily choose, apply or edit for your applications
  • Users no longer need to know ‘where the vessel database file is’, to copy and paste its web address or to browse to locate it on the network
  • Users can access equipment information in any Excel workbook, using the Catalist and Properties buttons on the DC Excel ribbon
  • Users on any device can access and view your equipment through a simple web browser interface; they do not need Scale-up Suite installed to do this; they need only to have a scale-up account and be listed on a current valid Dynochem license
  • The EDS is a foundation for future enhancements that leverage access to equipment data for many other everyday applications
  • The EDS will support a greater number of database fields, requested by customers to better describe your broad range of equipment types, including biologics set-ups.

If you don't know who your admin is, contact support@scale-up.com.  Here's a 1-minute (silent) YouTube video showing the EDS in action:

Dynochem: Secure access to equipment info, for users of your Equipment Data Service

Friday, July 2, 2021

Scale-up Suite 2 released! Plus news on Dynochem Biologics

It's been even busier than usual at Scale-up Systems recently and here's a catch-up in case you haven't seen it elsewhere yet.

We're delighted to have released Scale-up Suite 2 today (big thanks to Steve Hearn and all of our Development team!) and look forward to customers rolling this out over the next little while.  There are a host of enhancements as covered in our previous post and some new product options, notably RunScript Automation:

  • This exposes calls to the Dynochem and Reaction Lab APIs so that customers in our Digitalization Pilot Program can apply our scripts to create autonomous workflows and build their own scripts to implement high-throughput modeling alongside experimentation. Check out a preview video here [5 minutes].

Dynochem 6 has been released as a component of Scale-up Suite 2.

We've also separated the Dynochem model library for biologics into its own dedicated Resources site and created a new product option around this functionality:

  • It's called Dynochem Biologics and is focused on the development and manufacturing of large molecule therapeutics using cell culture (USP), downstream purification (DSP) and fill/finish operations.  
  • We've built in all of the usual supports you expect from us, including self-paced training and KB articles.  
  • Dynochem Biologics already works with Scale-up Suite 1 and we'll be moving all biologics users across to this platform at the end of July.

Actions for you:

  • Visit our Installer page to upgrade to Scale-up Suite 2 or have your IT colleagues do so if that's necessary in your organization.  Remember, our cloud licensing means you don't need any on-premise license server and Suite 2 works immediately with the cloud licenses (and content) you already use.
  • Contact your Scale-up Systems representative to find out more about our Digitalization Pilot Program or to explore Dynochem Biologics.

Saturday, April 24, 2021

Get ready for Dynochem 6 and Scale-up Suite 2: Modeling for Everyone

Last week, Peter Clark gave a preview of new features coming with Scale-up Suite 2.  If you missed the event live, as usual you can catch the recording in the resources site here.

Peter showed there is something for everyone in the new release.  Whatever modality of drug / active ingredient you develop or make, whether a small or large molecule or somewhere in between, whether made with cell culture or synthetic organic chemistry, your teams and your enterprise can obtain value daily from our tools.  That takes us several steps closer to our vision: to positively impact development of every potential medicine.

Scale-up Suite 2 includes:

  • Powerful equipment calculators for scale-up, scale-down and tech transfer, leveraging our industry standard vessel database format
  • Rigorous material properties calculators for pure components, mixtures and your proprietary molecules
  • Empirical / machine learning tools, to build and use regression models from your data with just a few clicks; including support for DRSM
  • Mechanistic modeling of any unit operation, in user-friendly authoring and model development environments
  • Hybrid modeling, combining the best of both worlds
  • Interactive data visualization, including parallel coordinates and animated contour plots for multidimensional datasets
  • New features to make modeling faster, more productive and more enjoyable, incorporating ideas suggested by customers and from our own team
  • New capabilities for autonomous creation of models, parameter fitting and process optimization 'headless' on the fly, as well as incorporation of real time data and access from any device.

We believe that:
  • Interdisciplinary collaboration accelerates process development and innovation
  • Models facilitate collaboration and knowledge exchange
  • Interactive, real-time simulations save days and weeks of speculation
  • Models are documents with a lifecycle extending from discovery to patient
  • Model authoring tools must be convenient and easy to use
  • Teams need models that are easily shared
  • Enterprises need tools that embed a modeling culture and support wide participation.

In other words, modeling should be an option for Everyone.  To make that a reality for you, we support our software tools with:

  • an Online Library, containing hundreds of templates, documentation and self-paced training
  • Free 1-hour on-line training events monthly
  • Half-day and full day options for face to face training, available globally
  • A free certification program to formally recognize your progress and skills
  • Outstanding user support from PhD qualified experts with experience supporting hundreds of projects like yours
  • A thriving user community, with round tables and regular customer presentations sharing knowledge and best practices.

We're celebrating 21 years serving the industry this year, supporting more than 20,000 user projects annually, for more than 100 customers all over the world, including 15 of the top 15 pharma companies.

If you're an industry, academic or regulatory practitioner, we invite you to join our user community and start to reap the benefits for your projects.

Tuesday, March 23, 2021

Bioreactor mass transfer: kLa (O2) versus kLa (CO2)

kLa is an emotive term for many in process development.  It evokes a certain mystery for those whose background is not chemical engineering, a 'TLA' (three-letter acronym) they hear over and over again.  Obtaining values for this scale-dependent 'mass transfer' parameter can be a significant undertaking, whether by experiments, empirical correlations or even CFD.  We provide purpose-designed tools to support fitting kLa to experimental data and for estimation using established correlations.  The experimental approach is the subject of this post.

The dominant experimental technique is the dynamic gassing out method, where dissolved gas concentration is followed versus time using a probe in the liquid phase.  A shortcut method allows kLa to be backed out from a semi-log plot; an implicit assumption here is that there is an abundance of gas.  A more rigorous approach that we advocate fits kLa to a model tracking multi-component mass and composition in both the liquid and gas phases.
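
By way of illustration, here is a minimal Python sketch of the shortcut method using a synthetic probe trace (all numbers assumed): kLa is backed out as the slope of the semi-log plot.  Real traces are better served by the rigorous two-phase fit described above.

    import numpy as np

    # Synthetic DO-probe trace; assumed values: true kLa = 20 1/h,
    # saturation C* = 100 % DO, starting from fully degassed liquid.
    t = np.linspace(0.0, 0.3, 16)                   # time, hours
    c_sat = 100.0
    c = c_sat * (1.0 - np.exp(-20.0 * t))           # "measured" dissolved O2

    # Shortcut: ln((C* - C)/(C* - C0)) = -kLa * t, valid only while the
    # gas phase stays close to its inlet composition (abundance of gas).
    y = np.log((c_sat - c) / (c_sat - c[0]))
    kla = -np.polyfit(t, y, 1)[0]
    print(f"fitted kLa = {kla:.1f} 1/h")            # recovers ~20 1/h here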

The shortcut method contributes to confusion about kLa(O2) versus kLa(CO2), two important gases in cell culture.  Dissolved CO2 can be followed using pH probes.  Practitioners sometimes report separate values for kLa(O2) and kLa(CO2), with kLa(CO2) typically lower and insensitive to agitation.

CO2 is much more soluble than O2 and the two mass transfers are usually in opposite directions in a bioreactor: O2 from gas to liquid and CO2 from liquid to gas.  Incoming air bubbles become saturated with CO2 after a relatively short period of contact, whereas they continue to liberate O2 for most or all of their contact time.  That leads to different sensitivities of dissolved O2 and CO2 to agitation and gas flow rate; and therefore different abilities to measure something close to kLa.  A very nice study of the gas phase in bioreactors by Christian Sieblist and colleagues from Roche bears out this trend.

Practitioners report that successful bioreactor operation and adequate control over both O2 and CO2 (and hence pH) depend strongly on agitation in the case of O2 and on gas flow rate in the case of CO2.  In reality it's a spectrum: kLa and gas flow rate may each matter somewhat for both responses, and the particular combination of kLa and gas flow (Qgas) determines the sensitivities for both gases.

We made some response surface plots from a series of gassing out simulations to illustrate.  These show the final amount of dissolved gas in solution at the end of each experiment, when kLa and Qgas are varied systematically in a 'virtual DOE'.  The initial liquid contained no O2 and some dissolved CO2 that was stripped during the experiment; the gas feed was air, so that dissolved O2 increased during the experiment.

Dissolved O2 at the end of a set of kLa measurement experiments in which kLa and Qgas were varied. The final O2 concentration is always sensitive to kLa and only sensitive to Qgas at very low gas flow rates. 

Dissolved CO2 at the end of a set of kLa measurement experiments in which kLa and Qgas were varied. The final CO2 concentration depends only on Qgas at low gas flows; and is sensitive to kLa only at relatively high gas flows. 

Transient concentrations of O2 and CO2 at low gas flow respond differently to changes in kLa.  In this illustration kLa has been increased between runs from 7 1/hr (dashed line) to 21 1/hr (solid line).  The dissolved oxygen profile responds but the CO2 profile remains unchanged.  Clearly, kLa(CO2) cannot be inferred from these data.
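
For readers who want to experiment, here is a deliberately small gassing-out model in the same spirit (a sketch, not our toolbox code): the well-mixed gas phase is taken as quasi-steady and solved algebraically at each instant, and all inputs, including the Henry coefficients and the kLa step from 7 to 21 1/hr, are assumptions for illustration.  With these numbers, the final dissolved O2 responds strongly to the kLa step while the final dissolved CO2 barely moves, mirroring the plots above.

    import numpy as np
    from scipy.integrate import solve_ivp

    P, R, T = 1.0, 0.08206, 298.0                  # atm, L.atm/mol/K, K
    VL = 10.0                                      # liquid volume, L
    kH = {"O2": 1.3e-3, "CO2": 3.4e-2}             # Henry coefficients, mol/L/atm
    y_in = {"O2": 0.21, "CO2": 0.0004}             # inlet air composition

    def dcdt(kla, qm, c, gas):
        # Quasi-steady gas balance qm*(y_in - y) = kLa*VL*(kH*P*y - C),
        # solved for the well-mixed outlet mole fraction y.
        y = (qm * y_in[gas] + kla * VL * c) / (qm + kla * VL * kH[gas] * P)
        return kla * (kH[gas] * P * y - c)         # liquid-side rate, mol/L/s

    def endpoint(kla_per_h, qv_l_min, t_end=600.0):
        kla = kla_per_h / 3600.0                   # 1/h -> 1/s
        qm = (qv_l_min / 60.0) * P / (R * T)       # gas feed, mol/s
        f = lambda t, c: [dcdt(kla, qm, c[0], "O2"),
                          dcdt(kla, qm, c[1], "CO2")]
        sol = solve_ivp(f, (0.0, t_end), [0.0, 1.0e-3], rtol=1e-8)
        return sol.y[:, -1]                        # dissolved O2, CO2 at t_end

    for kla_per_h in (7.0, 21.0):                  # the same step as above
        o2, co2 = endpoint(kla_per_h, qv_l_min=0.5)
        print(f"kLa = {kla_per_h:4.1f} 1/h: O2 = {o2:.2e}, CO2 = {co2:.2e} mol/L")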

Wednesday, January 27, 2021

Dynochem biologics model library released

Many thanks to customers who engaged with Scale-up Systems as we "built, broke and bettered" our biologics model library in the run-up to release late last year.

More than one hundred biopharmaceutical companies in the Scale-up Suite global user community can now access the tools for immediate use here (https://dcresources.scale-up.com/?q=bio).  An overview of the biologics library is available here.

We expect each tool to grow and be refined by the repeated use that is typical of customer activity and we look forward to supporting more users in taking up the tools in their daily work.

Much like the small molecule opportunity, mechanistic modeling has great potential to accelerate the development of large molecules by shortening development time, making best use of experiments and anticipating manufacturing challenges.  Ours is the first fit-for-purpose and comprehensive mechanistic model library to be built and released in this space, another first of which we are very proud.

Using the Dynochem biologics library delivers daily benefits in development and scale-up while creating digital twins to support your digitalization strategy

Training opportunities using the new tools will be available at regular intervals this year.  Let us know if you'd like a dedicated session for your company or site.

Feel free to share this post with anyone you think may benefit.

Friday, September 25, 2020

Inspired by the industry response

Many of our posts on the blog this year have been about the pandemic, predicting its course and interpreting reported data for cases and deaths.  

We have seen that population level Dynochem models have been sufficiently accurate to describe the data for each country and quantify the potential future impact of the outbreak as well as the effectiveness of non-pharmaceutical measures, such as lockdowns and the wearing of masks.  

Our models for the outbreak will remain available to the user community on our COVID site.  We do not plan to further develop or update the models for the foreseeable future.  

New content on the blog will return to our core focus, positively impacting the development of medicines by our customers, the global pharmaceutical industry.  

We are proud to serve the pharmaceutical industry, supporting daily core business activities at more than 100 organizations that develop or make medicines. The industry response to COVID-19 has been inspiring and no less than we expected, having worked with some of these companies for two decades.

All of our normal activities including software development, user support and training have continued in a fully operational state and we have seen increased activity from customers both using and learning the tools. Of course we are delivering all events on-line for now. Public training events are half price during the outbreak and we have been offering training licenses to customers delivering their own internal curriculum.

We have ramped up our own support for unit operations likely to be involved in manufacturing vaccines and treatments, including bioreactors and lyophilization.

We would be delighted to hear from members of the community anytime if you have ideas or suggestions as to how we could do more, by email to support@scale-up.com.


Thursday, February 20, 2020

Ed Paul

We were sorry to hear recently that Ed Paul has died.

On a personal level, we shared a lot of laughs and discussions at meetings and conferences of the North American Mixing Forum.

Professionally, Ed was the lead author of the Handbook of Industrial Mixing and before that had an outstanding chemical engineering career with Merck.  Ed's observations of mixing effects on homogeneous reactions spawned a whole new field of research and ultimately led to understanding of phenomena such as micromixing and mesomixing.

Ed's 1971 paper following his PhD thesis spawned a whole new field of chemical engineering research

Scale-up Systems was delighted to host Ed in Dublin for a few days in August 2002, when he shared his experiences of pharmaceutical chemical development and scale-up and delivered extensive notes that we used to strengthen our model library and knowledge base for customers.

Some notes from Ed's consulting visit to Scale-up Systems in Dublin, 2002

Ar dheis Dé go raibh a anam (may his soul be at God's right hand).

Monday, February 3, 2020

2019 round-up and looking forward to 2020

To ardent watchers of this blog - you know who you are - apologies for the pause in postings.  A lot has been happening since August 2019 as we follow our mission to accelerate process development and positively impact the development of every potential medicine.

We'll be posting more regularly in 2020, with lots of news and new capabilities in the pipeline.  In the meantime, here's a catch-up on some items from the latter part of 2019 and a picture that summarizes a few of them:

A few Scale-up Systems and Scale-up Suite highlights from the end of 2019; more details below.

  • We presented our sponsored AIChE award for Outstanding Contribution to QbD for Drug Substance to the 2019 winner, Zoltan Nagy of Purdue University, at the Annual Meeting in Orlando [pic - top right]
  • We nominated John Peterson of GSK for the corresponding Pfizer-sponsored Drug Product award, for his excellent work on statistics of design space, and he won.  It was great to catch up with John at the awards session [pic - top left]
  • Andrew Bird presented statistically rigorous calculations of design space for three common unit operations, along with ways to dramatically accelerate the calculations [pic - top centre]; watch for the details in a 2020 webinar
  • A growing band of Dynochem and Reaction Lab users are keeping warm this winter in our 'beanie' hats, helping the environment with our keep-cups and looking forward to summer in our polo shirts [pic - bottom centre]
  • And we've updated our certification programs to a fully automated on-line system with randomized questions.  Visit the Resources site and search for 'certified' to find out more and take the test.

Sunday, August 18, 2019

Part 6 of 6: Join us on the journey to 'Industry 4.0'

If you've been following this series on Industry 4.0, you'll know by now that multiple technologies, now maturing or soon to mature, are creating significant opportunities for acceleration and transformation in chemical development and manufacturing.

If you already work with our company, you'll have seen the model libraries, software tools, training materials and user support that make everyday applications a reality.  If you haven't seen them yet, most likely your company is one of the over 100 with access - contact us if you cannot easily locate your internal champion.

Over 100 companies engaged in the development and commercialization of new medicines rely on our tools at over 400 sites worldwide

We're always listening to feedback and, even since this series started, have delivered updates in Scale-up Suite to extend support for automation applications, with parallel processing for more tasks and new function calls to return 'data cubes' as arrays in memory from large simulation sets.
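
As a generic illustration of why arrays in memory are convenient (plain numpy, with a made-up shape rather than any real function call), a data cube of simulation results can be sliced directly:

    import numpy as np

    # Stand-in for results from a large simulation set, shaped
    # (virtual experiment, timepoint, response); random values here.
    cube = np.random.default_rng(0).random((500, 200, 3))

    final = cube[:, -1, 0]             # response 0 at the final timepoint
    best = int(final.argmax())         # index of the best virtual experiment
    trajectories = cube[best]          # its full time profiles, all responses
    print(best, final[best], trajectories.shape)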

We've received excellent feedback from our user base in the fast-growing CMO space: leveraging our vessel database format for digital tech transfer is helping to reduce costs and failures, increase speed to make room for more customer projects, reduce deviations and modernize the approach to on-boarding new projects.

We hope you've enjoyed this series of six postings.  If you want to hear more, join us for our webinar on this topic in early September:

[Webinar] Opportunities to accelerate Chemical Development as part of "Industry 4.0"

Thursday, July 11, 2019

Part 5 of 6: Opportunities to accelerate projects

You may already know that the most commonly used noun in the English language is "time".  In today's world, many of us feel almost permanently under time pressure and we talk about not having enough time for all kinds of things we'd like to do.  Not having time takes on a whole new meaning for patients with life-changing medical conditions, reminding us in chemical development and scale-up that opportunities to accelerate our work and the commercialization of new medicines should be taken with both hands.

Achieving acceleration using modeling (e.g. Dynochem or Reaction Lab) is already well covered by extensive case studies from customers in Dynochem Resources.  Acceleration using automation of modeling and connection of modeling to other workflows is the subject of this post.  In our core software development team, we have thought a lot about these future applications and taken steps to support their realization, providing a platform and the ‘hooks’ needed to link with other technologies.

A basic building block is the ability to automatically generate and run a large number of virtual experiments.  We use parallel processing to execute the simulations, as illustrated in the short animation below.  The automation calls are exposed and may be 'scripted' and run by other programs (e.g. Python) as part of an integrated workflow.

Chemists and engineers can leverage automated generation and execution of a large set of virtual experiments with parallel processing and collation of results in convenient Excel tables and contour plots.
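
The scripting pattern looks roughly like the Python sketch below.  Here run_experiment is a hypothetical placeholder for a call into the simulator rather than a real Scale-up Suite function, so treat this as the shape of the workflow, not working integration code.

    from concurrent.futures import ProcessPoolExecutor
    from itertools import product

    def run_experiment(params):
        """Hypothetical stand-in for a scripted call into the simulator."""
        temp_c, equiv = params
        # ... launch one virtual experiment here and collect its endpoint ...
        return {"T": temp_c, "equiv": equiv, "yield": None}

    if __name__ == "__main__":
        grid = list(product([20, 40, 60, 80], [1.0, 1.1, 1.2, 1.3]))  # 16 runs
        with ProcessPoolExecutor() as pool:        # parallel execution
            results = list(pool.map(run_experiment, grid))
        print(len(results), "virtual experiments completed")
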
Tasks involved in model building may also be scripted / automated in Dynochem 5 and Reaction Lab.  For example, area percent data may be entered in a model, a set of kinetic parameters fitted and many simulations carried out, all without human intervention.  Doing this requires some scripting / code at several stages in the workflow.  Cloud computing resources (Azure or AWS) may be used for execution, leveraging our cloud licensing.

For example, the animation below shows scripted fitting of three UA (heat transfer characterization) values to three solvent tests using Dynochem 5.  Fitting the parameters needed for each of three liquid levels in a reactor takes only a short time.  (The ‘fit’ button is just for demo purposes; normally the fit would be started from another scripted workflow process.)

Scripted parameter fitting is possible using new function calls built into Dynochem 5 and Reaction Lab; this example illustrates automated heat transfer characterization (UA) and the techniques are equally applicable to e.g. chemical kinetics.
Additional opportunities exist in leveraging information from electronic lab notebooks (ELN) to create models for users that are already populated with features such as chemical structures and experimental data.  Moving beyond today's relatively crude self-optimizing reactor algorithms, customers are interested in closing the loop between modeling and experimentation, using model outputs to set up and execute the next experiment(s) in a fully automated loop.

Contact our support team if you'd like to discuss any of these applications further for use inside your organization.

Friday, June 7, 2019

Part 4 of 6: Where will the models come from?

If mechanistic modeling is to become a focal point in the project lifecycle, you have to address the question of where the models will come from.  In this context, by 'model' we mean i) the set of equations to be solved, ii) in executable form, with iii) initial values, iv) fitted parameter values where needed and v) experimental data to assess model accuracy.

Q: Who can create these models and when does it make sense for them to do so?
A: For tangible benefits, the creators and users should be the same practitioners / project teams that own and run the development projects, not specialists in an ivory tower focused only on modeling.  Model development should occur before and during experimentation.  Modeling should not be a 'post-processing' activity that occurs too late to add value or when the time window for data collection has passed.

In Dynochem 5 and Reaction Lab, we have streamlined the process in i) to v) so that this vision is achievable.  We include further notes on the individual steps below.

Steps i) to v) can be accomplished in a snap for chemical reactions using Reaction Lab.  The resulting model can be leveraged over and over during the project lifecycle.

Item i) may be clear and simple for certain common unit operations like heating/ cooling and perhaps filtration; for many other operations, identifying which equations to solve may be iterative and challenging.  For models of fairly low complexity, like distillation, while the equation set may be obvious, it is unwieldy to write down for multi-component systems including the energy balance.  For models of chemical reactions, the set of elementary reactions will not become clear until the full cycle i)-v) has been repeated more than once by knowledgeable process chemists.

Unlike some other tools, we do not force users to populate 'matrices' just to define reactions and reaction orders (!)

Item ii) is an obstacle for practitioners who only have access to spreadsheets or specialized computing/coding environments.  These force the user to develop or select a specific solution method and run the risk of significant numerical integration inaccuracies.  Even then, simulations will lack interactivity and parameter estimation will require scripting or complex code.  Some 'high-end' engineering software tools present similar challenges, lacking comprehensive model libraries and forcing users to write custom models, delve into solution algorithms and confront challenges such as 'convergence' that feel highly tangential to project goals.

Item iii) should be easy for practitioners and in practice it can be so, if the software supports flexible units conversion (in and out of SI units) and contains supporting tools to provide initial estimates of physical properties and equipment characteristics.

Item iv) requires the model to be run many times and compared with experimental results.  Specialized algorithms are needed to minimize the gap between model predictions and experimental data.  When multiple parameters must be fitted to multiple responses in multiple experiments, this gets close to impossible in a spreadsheet model or a general-purpose mathematical software environment.
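
To make item iv) concrete, the sketch below fits two rate constants of a hypothetical first-order A -> B -> C reaction to data from several experiments at once, by stacking every residual into a single vector.  This shows the general pattern only, not our fitting engine.

    import numpy as np
    from scipy.optimize import least_squares

    def simulate(k, t):
        """Analytic concentrations for A -> B -> C with B0 = 0 (needs k1 != k2)."""
        k1, k2 = k
        a = np.exp(-k1 * t)
        b = k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))
        return a, b

    def residuals(k, experiments):
        res = []
        for t, a_obs, b_obs in experiments:        # multiple experiments...
            a, b = simulate(k, t)
            res.extend(a - a_obs)                  # ...and multiple responses
            res.extend(b - b_obs)
        return np.asarray(res)

    t = np.linspace(0.0, 5.0, 8)
    a_true, b_true = simulate([1.2, 0.4], t)       # synthetic "data"
    fit = least_squares(residuals, x0=[0.5, 1.0], args=([(t, a_true, b_true)],))
    print(fit.x)                                   # recovers ~[1.2, 0.4]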

Item v) is mainly the province of the experimenter and, once each experiment has been completed, requires an easy mechanism for aggregating the data, with flexible units handling (including HPLC Area, Area%) being a major help.

And so to answer the question in the title of this post: You guessed it!  We expect the majority of chemical reaction and unit operation models in Pharma to continue to be developed using our tools in preference to home-made or overly complex environments.  As the volume of modeling activity grows with Industry 4.0 and related developments, we already see this trend becoming more pronounced, with many practitioners needing to use the same model over a project lifecycle, requiring speed and ease of use as well as accuracy and rigour.

Friday, May 10, 2019

Post 3 of 6: Central role of mechanistic modeling in Chemical Development

Chemical Development is a complex and challenging undertaking, involving a large effort from multi-disciplinary teams, sometimes battling Mother Nature, with compressed timelines and limited material for experimentation.  There is a broad spectrum of approaches to this challenge, including new lab instruments, use of robotics and automation, outsourcing certain types of development or operations and use of statistical and mechanistic modeling.  Companies also experiment to find the best organization structure for this function and frequently separate departments specialize in Analytical (Chemistry) Development, Chemical (Process) Development, Technology Transfer and preparation of Regulatory filings.  Collaboration among these groups helps achieve development goals.

Figure 1: A simplified representation of chemical development today, including the scale and locus of statistical and mechanistic modeling

Figure 1 is a much simplified graphical representation of the activities involved.  There is a large reliance on experiments.  Groups involved in process definition and optimization are currently the main users of both statistical and mechanistic modeling.  Technology transfer increasingly involves working with external partners remotely.  Data search and gathering, including data integrity reviews and preparation of regulatory filings, are mostly manual processes.  The disparate nature of activities and the need for specialization make them somewhat siloed, with risks of duplication and dilution of effort.  For example, an experimental program may be repeated if the first program missed some key information; or repeated by a CRO to answer new questions that have arisen; or repeated by a CMO in order to accomplish successful tech transfer.  None of these data may be harnessed effectively and shared to answer future questions.

Leading companies are changing their approach to chemical development and bringing mechanistic process modeling on stream earlier and more centrally than before.  The idea is not new but advances in a range of technologies (see earlier posts) and the momentum of 'Industry 4.0' are helping to fuel the transformation.  At a task level, using a model to design the right experiments reduces overall effort.  At a project level, the model provides a place to capture the knowledge and reuse it in future.  At an organization level, modeling provides a structured, reusable and digital approach to information sharing and retrieval.  For example, questions can be answered in real time, without experimentation, interactively when they arise, even live in a meeting or webcon, sparing delays, speculation and doubts, allowing faster progress.

Figure 2: Future shape of chemical development activities, with mechanistic process models as the focal point for information capture and reuse.

The pieces in Figure 1 are rearranged in a natural way in Figure 2 as a cycle that captures and makes the most of information generated during each chemical development activity, including modeling.  Additional items have been added to reflect technologies that are relatively new to Pharma, including continuous manufacturing and feedback process control; opportunities to apply either or both of these in chemical development or full scale manufacturing can be evaluated using a mechanistic process model.  Therefore the mechanistic model takes up a central position and is the focal point in the new chemical development process.

It will take some time before Figure 2 reaches its full potential.  The throughput of models in chemical development organizations is already increasing as model building tools become easier to use and more prevalent.  We're delighted to be able to lead the way with Scale-up Suite.

Figure 2 also includes some great opportunities to automate workflows.  We'll discuss some of these in a later post.  

Wednesday, May 1, 2019

Post 2 of 6: A brief history

The Wall Street Journal ran an article in September 2003, entitled "New Prescription For Drug Makers: Update the Plants", comparing and contrasting pharma manufacturing techniques with other industries.  The subtitle ran, perhaps unfairly, "After Years of Neglect, Industry Focuses On Manufacturing; FDA Acts as a Catalyst".

Our DynoChem software entered the industry a few years prior, the prototype having been developed as a dynamic simulator within Zeneca, so that users could "create a dynamic model without having to write differential equations".  We first proved that the software could be used to solve process development and manufacturing problems (e.g. with hydrogenations, exothermic additions), then rewrote the source code and began to add features that made modeling by non-specialists an everyday reality.

There have been many pharma industry leaders who have recognized the potential for modeling to help modernize development and manufacturing.  One example is Dr Paul McKenzie and his leadership team at Bristol-Myers Squibb (BMS) at the time, who cited the Wall Street Journal piece in an invited AIChEJ Perspectives article and also in presentations like this one at the Council for Chemical Research (CCR) in December 2005 - you can get the full slide deck here.

Cover slide from presentation by Paul McKenzie of BMS at CCR Workshop on Process Analytical Technology (PAT), December 13, 2005, Rockville, MD

Today, while the landscape for data storage, sharing and visualization has moved ahead significantly, with the emergence of ELN, cloud and mobile, the chemical and engineering fundamentals of defining and executing a good manufacturing process remain the same:

Some capabilities required to develop robust and scalable processes, from the 2005 CCR presentation

Our Scale-up Suite extends these capabilities to more than 100 pharma development and manufacturing organizations worldwide, including 15 of the top 15 pharmaceutical companies.  This broad and growing base of users, armed with clean and modern user interfaces, calculation power and speed in Reaction Lab and Dynochem 5, provides a firm foundation for the next wave of industry transformation.

We're always delighted to hear what users think.  Here are some recent quotes you may not have seen yet:

  • "If you can book a flight on-line, you can use Dynochem utilities" [we like this especially because we hear that using some other tools is like learning to fly a plane]
  • "Our chemists are thoroughly enjoying the capabilities of Reaction Lab software and are quite thrilled with the tool".

In the next post, we will look at the increasingly central role of mechanistic modeling in process development.

Monday, April 29, 2019

Post 1 of 6: Exciting times in Chemical Development

It's an exciting time to be part of the Pharma industry's chemical development ecosystem, with new opportunities being created and adopted to accelerate development of new medicines.  This is the first in a short series of posts that will focus on the role of predictive, mechanistic modeling in the industry's transformation.

The much-talked-about 'Industry 4.0' phenomenon has led to the creation of awkward terms such as 'digitalization', and one positive consequence of the hype is that it has somewhat aligned the goals of senior managers, systems integrators, consulting companies and industry vendors.  We especially liked the review by Deloitte that uses the term 'exponential technologies' to group many of the developments that underpin current transformation opportunities:

Snapshot of exponential technologies covered in the Deloitte study, Exponential Technologies in Manufacturing

We'll be highlighting the role of digital design, simulation & integration, technologies that our customers have practiced on a growing scale for nearly twenty years.  We expect the rate of growth to increase quite sharply as new developments, like Reaction Lab, make adoption easier and simulation is integrated with other developing technologies.

If the above whets your appetite, watch this space for the next piece in this series.

As always, customers can contact our support team to discuss immediate applications.

Friday, February 22, 2019

Dynochem 5 released as part of the new Scale-up Suite

In our 14 February webinar, Scale-up Systems was delighted to announce the release of Dynochem 5 as part of the new Scale-up Suite, which also includes Reaction Lab and Numero Chem.

Scale-up Suite includes Dynochem and new products Reaction Lab and Numero Chem

Reaction Lab: Kinetics meets ELN

This is the culmination of great work by our software development team, inspired by customer feedback and led by Dr Steve Hearn.

High-level details about the components in Scale-up Suite can be found at the new look scale-up.com website.  Members of scale-up.com can get more detail and access to the tools via the Dynochem Resources and Reaction Lab Resources websites.

We've started a program of weekly 30-minute webinars to talk through the new features and hope that customers and prospective customers can make some of those live (or watch the recordings) over the next month or two.

Your Dynochem 4 content will work in Dynochem 5 and you should plan to upgrade as soon as practicable for you.  Expect a host of improvements in speed, ease of use and accuracy, the latter especially for material properties.

Use the links at the side of this blog to explore more.  As always, we'd love to hear your feedback to support@scale-up.com.

Thursday, July 19, 2018

Great set of guest webinars so far this year, more to come, including 'Bourne' sequel; enjoy on your phone

We hope you've been enjoying our free to attend guest webinar program this year as much as we have.

To date in 2018, Syngenta, Johnson Matthey, Nalas, Amgen and Teva have covered topics from one end of a manufacturing stage to the other, addressing synthesis, experimental design, process safety, crystallization and drying.

Who needs Netflix and HBO?  You can enjoy last week's Guest Webinar by Tom Corrie, Syngenta: “Accelerating Active Ingredient Development with Early Stage DynoChem Simulations", and all other webinars in our Guest series, on your smartphone / mobile device, any time of the day or night. [Screenshot from iPhone 8 shown here]

A reminder that you can use your phone both to attend live (Adobe Connect app) and to enjoy recordings (MP4 format, see iPhone screenshot above).  In line with the spirit of GDPR regulations, the identities of our attendees are now anonymized in recordings.

We're impressed by the innovative ways in which users apply our tools and also by their openness in discussing the process development challenges they face and the solutions they have found.  And there's more to come this year, with Sarafinas Process & Mixing Consulting on use of the legendary 'Bourne Reactions', UCD on continuous crystallization and AstraZeneca on centrifugation all in the schedule.

Thanks to Steve Cropper and Peter Clark of our team for continuing to line up a great annual program.  2019 is already looking good, with Flow Chemistry and Drying webinars planned.

Thursday, March 29, 2018

BioPharma Europe Initiative: giving Pharma manufacturing a distinctive voice in Brussels

Through our involvement with the pre-competitive collaboration centre SSPC, we have attended a number of events organized by BioPharma Europe, a growing SSPC-led initiative that is raising awareness in the European Parliament and Commission of the unique role, position and needs of the European Pharma industry and seeking policy initiatives that support a strong future for Pharmaceutical Manufacturing in Europe.

Source: EFPIA

After a good start, this group is now seeking to build support from a wider network of European pharma industry stakeholders in the next phase of discussion with Europe's research and regulatory policymakers, such that future policy decisions support this strategically important industry in the globally competitive landscape.

At Scale-up Systems, we are proud of our excellent relationships with pharma companies, CMOs and CROs and are delighted to bring BioPharma Europe to the attention of our customers.  Organizations wishing to find out more about BioPharma Europe should contact Aisling Arthur at SSPC.

Tuesday, February 27, 2018

A PSD trend that is not widely reported - thanks, Orel

While supporting customers who apply DynoChem for crystallization modeling, we have seen several cases where some of the familiar quantiles of the PSD (D10, D50, D90) reduce with time during at least the initial part of the crystallization process.

On reflection, one should not be too surprised: these are statistics rather than the sizes of any individual particles.  In fact, all particles may be getting larger while the weighting of the PSD shifts towards smaller sizes (where particles are more numerous, even without nucleation); in certain cases, this causes D90, D50 and maybe even D10 to reduce during growth.
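
A deliberately simple numerical sketch (all numbers assumed) shows the effect: build a wide seed PSD from many fines plus a few coarse particles, grow every particle by the same increment, and the mass-weighted D10 and D50 nevertheless fall.

    import numpy as np

    def mass_quantile(sizes, q):
        """Size below which a fraction q of the total particle mass lies."""
        order = np.argsort(sizes)
        cum = np.cumsum(sizes[order] ** 3)         # mass ~ L**3, cumulative
        return float(np.interp(q * cum[-1], cum, sizes[order]))

    seed = np.concatenate([np.full(10_000, 10.0),  # 10,000 fines at 10 um
                           np.full(100, 100.0)])   # 100 coarse at 100 um
    grown = seed + 20.0                            # every particle grows 20 um

    for label, s in (("seed ", seed), ("grown", grown)):
        print(label, [round(mass_quantile(s, q), 1) for q in (0.1, 0.5, 0.9)])
    # Here D50 falls from 100 to 30 um even though every particle got bigger:
    # the numerous fines gain mass fastest and pull the weighting downwards.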

Last week we had an excellent Guest Webinar from Orel Mizrahi of Teva and Ariel University, who characterized a system with this behaviour, with modeling work summarised in the screenshot below.

D10, D50 and D90 trends in a seeded cooling crystallization: measured data (symbols) and model predictions (curves).

There was a good discussion of these results during Orel's webinar and we decided to make a short animation of a similar system using results from the DynoChem Crystallization Toolbox to help illustrate the effect.

Cumulative PSD from the DynoChem Crystallization toolbox, showing the evolution of PSD shape during growth from a wide seed PSD.  The movement of quantiles D10, D50 and D90 is shown in the lines dropped to the size axis of the curve.

In this illustration, the reduction in D50 can be seen briefly and the reduction in D90 continues through most of the process.  From the changing shape of the curve, with most of the movement on the left-hand side, it is clear that most of the mass is deposited on the (much more numerous) smaller particles.

We see this trend even in growth-dominated systems, when the seed PSD is wide.

Wednesday, January 24, 2018

Run typical crystallization experimental design in silico using DynoChem

Faced with challenging timelines for crystallization process development, practitioners typically find themselves running a DOE (statistical design of experiments) and measuring end-point results to see what factors most affect the outcome (often PSD, D10, D50, D90, span).  Thermodynamic, scale-independent effects (like solubility) may be muddled with scale-dependent kinetic effects (like seed temperature and cooling rate or time) in these studies, making results harder to generalize and scale.

First-principles models of crystallization may never be quantitatively perfect - the phenomena are complex and measurement data are limited - but even a semi-quantitative first-principles kinetic model can inform and guide experimentation in a way that DOE or trial and error experimentation can not, leading to a reduction in overall effort and a gain in process understanding, as long as the model is easy to build.

Scale-up predictions for crystallization are often based on maintaining similar agitation, and power per unit mass (or volume) is a typical check, even if the geometry on scale is very different from the lab.  A first-principles approach considers additional factors, such as whether the solids are fully suspended or over-agitated, how well the heat transfer surface can remove heat and the mixing time associated with the incoming antisolvent feed.
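
One such first-principles check is the just-suspended impeller speed from the Zwietering correlation, sketched below; the geometry constant S and the other inputs are illustrative assumptions to be replaced with values for your own vessel and solids.

    def njs_zwietering(S, nu, dp, rho_l, rho_s, X, D):
        """Just-suspended impeller speed (rev/s) via the Zwietering correlation.

        S: geometry constant (-); nu: kinematic viscosity (m2/s); dp: particle
        size (m); X: solids loading (kg solid per 100 kg liquid); D: impeller
        diameter (m)."""
        g = 9.81
        return (S * nu**0.1 * dp**0.2
                * (g * (rho_s - rho_l) / rho_l)**0.45
                * X**0.13 / D**0.85)

    # Illustrative inputs: 100 um crystals, 10 wt% loading, 0.5 m impeller.
    n_js = njs_zwietering(S=5.0, nu=1.0e-6, dp=100e-6, rho_l=1000.0,
                          rho_s=1400.0, X=10.0, D=0.5)
    print(f"N_js ~ {n_js:.2f} rev/s ({60 * n_js:.0f} rpm)")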

The DynoChem crystallization library and the associated online training exercises and utilities show how to integrate all of these factors, by designing focused experiments and making quick calculations to obtain thermodynamic, kinetic and vessel performance data separately, before integrating these to both optimize and scale process performance.

Users can easily perform an automated in-silico version of the typical lab DOE in minutes, with 'virtual experiments' reflecting performance of the scaled-up process.  Even if the results are not fully quantitative, users learn about the sensitivities and robustness of their process as well as its scale-dependence.  This heightened awareness alone may be sufficient to resolve problems that arise later in development and scale-up, in a calm and rational manner.  Some sample results of a virtual DOE are given below by way of example.

Heat-map of in-silico DOE at plant scale agitation conditions, showing the effects of four typical factors on D50.  The largest D50 is obtained in this case with the highest seeding temperature, lowest seed loading and longest addition (phase 1) time.  Cooling time (phase 2) has a weak effect over the range considered.

Click here to learn how to apply these tools.
