- This topic has 2 replies, 2 voices, and was last updated 4 years, 9 months ago by whiskyjim.
- February 26, 2019 at 5:25 pm, post #3773, whiskyjim (Moderator, Karma: 262, Rank: Jedi)
I’m trying to model the mash cooking process in the distillery. For the bourbon mash (a mixture of water and grains), cooking is a two-step process. Step 1: the mash is heated by steam injection to 206°F and held at that temperature for a specific amount of time. Step 2: the mash is cooled to 148°F using a chilled-water loop, then held at that temperature until it is time to transfer the mash to a fermenter.
I’ve been able to model the behavior of the cooking vessel heating up by routing the CVEU output of the PIDE through a DEDT, then an LDLG, and back to the PV input of the PIDE. When I run the simulation, the trend looks like the typical exponential (1 − exp(−t)) rise (think of a charging capacitor).
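For anyone following along, the DEDT-into-LDLG loop described above amounts to a first-order-plus-deadtime process model. Here is a minimal Python sketch of the same structure; the gain, time constant, and deadtime values are illustrative placeholders, not plant data:

```python
from collections import deque

def fopdt_step(cv, pv0=60.0, gain=1.5, tau=120.0, deadtime=15.0,
               dt=1.0, steps=600):
    """First-order-plus-deadtime response to a CV step.

    Mirrors the DEDT -> LDLG path: PV rises toward pv0 + gain*cv
    along a (1 - exp(-t/tau)) curve, shifted by the deadtime.
    All tuning numbers here are illustrative assumptions.
    """
    n_delay = max(1, round(deadtime / dt))
    pipe = deque([0.0] * n_delay)   # CV history: the transport delay (DEDT)
    pv = pv0
    trace = []
    for _ in range(steps):
        pipe.append(cv)             # current CV enters the delay line
        u = pipe.popleft()          # delayed CV leaves it
        # first-order lag (the lag half of an LDLG):
        # dPV/dt = (pv0 + gain*u - PV) / tau
        pv += dt * (pv0 + gain * u - pv) / tau
        trace.append(pv)
    return trace
```

With a constant CV of 100%, the trace holds at pv0 through the deadtime, then climbs exponentially toward pv0 + gain·CV, which matches the charging-capacitor trend described above.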
My problem is modeling the cool-down portion of the cycle. With the current setup, the simulated “cooling” cycle is too fast for the logic I’m trying to test. I need a way to simulate an exponential decay curve (think of a capacitor discharging).
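The target curve is just exponential decay toward the chilled-water limit. A quick Python sketch of the desired behavior (the 900-second time constant is an assumption; it would need to be tuned to the real vessel):

```python
import math

def cool_down(pv_hot=206.0, pv_cold=148.0, tau=900.0, dt=1.0, steps=1800):
    """Discharging-capacitor style cool-down:

        pv(t) = pv_cold + (pv_hot - pv_cold) * exp(-t / tau)

    pv decays from the cook temperature (206F) toward the chilled-water
    holding temperature (148F). tau is an illustrative guess; a larger
    tau gives the slower cool-down the simulation needs.
    """
    return [pv_cold + (pv_hot - pv_cold) * math.exp(-k * dt / tau)
            for k in range(steps)]
```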
I’d appreciate any ideas anyone can throw at me.
Jim
- February 27, 2019 at 12:38 pm, post #3774, Anonymous (Karma: 0, Rank: Padawan)
Maybe have a DEDT and LDLG pair for both heating and cooling, and adjust the settings of the cooling DEDT and LDLG to slow it down. Use a SEL with a heating/cooling trigger to determine which pair is used as the PV for the PIDE.
- February 27, 2019 at 1:43 pm, post #3776, whiskyjim (Moderator, Karma: 262, Rank: Jedi)
I thought about doing something like what you suggested, but I was holding out for a solution that wouldn’t require me to do any “switching.” I’ll implement this approach and see how it goes.
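The switched approach being discussed can be sketched in Python: one simulated PV, with a heating/cooling flag selecting which first-order model drives it, the way a SEL would select between the two DEDT/LDLG paths. The setpoints come from the original post; the time constants and hold time are illustrative assumptions:

```python
def cook_cycle(steps=3600, dt=1.0, pv0=70.0,
               heat_sp=206.0, cool_sp=148.0,
               tau_heat=120.0, tau_cool=900.0, hold_s=600.0):
    """Two first-order models selected by a heating/cooling trigger.

    tau_cool >> tau_heat makes the simulated cool-down much slower
    than the heat-up, which is the behavior the logic under test
    needs. All tuning values are illustrative, not plant data.
    """
    pv, mode, held = pv0, "heat", 0.0
    trace = []
    for _ in range(steps):
        if mode == "heat":
            target, tau = heat_sp, tau_heat      # steam-injection model
            if abs(pv - heat_sp) < 0.5:          # at cook temp: count hold time
                held += dt
                if held >= hold_s:               # hold done: start the chill loop
                    mode = "cool"
        else:
            target, tau = cool_sp, tau_cool      # chilled-water model
        pv += dt * (target - pv) / tau           # first-order approach to target
        trace.append((mode, pv))
    return trace
```

The simulated PV rises to 206°F like a charging capacitor, holds there for the cook time, then decays toward 148°F on the slower cooling time constant.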