I created the following clock enable block:
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity clock_enable is
    generic (
        OUTPUT_CLOCK_ENABLE_PERIOD : time;
        INPUT_CLOCK_ENABLE_PERIOD  : time
    );
    port (
        clk          : in  std_logic;
        reset        : in  std_logic;
        enable       : in  std_logic;
        clock_enable : out std_logic
    );
end clock_enable;

architecture behavioral of clock_enable is
    constant CE_CLOCK_ENABLE_TIMEOUT_COUNT : natural :=
        OUTPUT_CLOCK_ENABLE_PERIOD / INPUT_CLOCK_ENABLE_PERIOD;
    signal clock_enable_i : std_logic := '0';
begin
    clock_enable ...
    -- [... the counter process and the start of the structural
    --  architecture are missing here; the fragment resumes inside
    --  the port map of the first (1 us) instantiation ...]
    ... '1', clock_enable => clock_enable_1_us_i );

    clock_enable_1_ms_component : entity work.clock_enable
        generic map (
            OUTPUT_CLOCK_ENABLE_PERIOD => 1 ms,
            INPUT_CLOCK_ENABLE_PERIOD  => 1 us
        )
        port map (
            clk          => clk,
            reset        => reset,
            enable       => clock_enable_1_us_i,
            clock_enable => clock_enable_1_ms
        );
end architecture structural;
The 1 us enable works correctly, but the 1 ms clock enable, which is driven by the 1 us clock enable, comes out at the wrong frequency. Looking at the synthesis output, the 1 ms instance gets the same size counter (7 bits) as the 1 us instance, even though a 1 ms period divided by a 1 us enable should give a count of 1000 and therefore a 10-bit counter. Does anyone have any ideas what I am doing incorrectly?
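For reference, here is the quick arithmetic I used to sanity-check the expected counter widths (plain arithmetic, outside any synthesis tool; periods given in ns):

    from math import ceil, log2

    def counter_bits(output_period_ns: float, input_period_ns: float) -> int:
        """Bits needed to count output_period / input_period enable ticks."""
        count = round(output_period_ns / input_period_ns)
        return ceil(log2(count))

    # 1 ms enable driven by the 1 us enable: 1000 ticks -> expect 10 bits,
    # yet synthesis reports 7 bits (enough for counts up to 128 only).
    print(counter_bits(1_000_000, 1_000))

So a 7-bit counter cannot possibly count to 1000, which is why I suspect the generic values are not reaching the second instance the way I intended.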
Thanks,
Jim