From: "qing.zhao at oracle dot com"
To: gcc-bugs@gcc.gnu.org
Subject: [Bug gcov-profile/95348] GCC records zero functions and modules in the profiling data file, ICC does NOT
Date: Thu, 11 Jun 2020 14:12:28 +0000
X-Bugzilla-Product: gcc
X-Bugzilla-Component: gcov-profile
X-Bugzilla-Version: 11.0
X-Bugzilla-Status: WAITING
X-Bugzilla-Assigned-To: marxin at gcc dot gnu.org
X-Bugzilla-Target-Milestone: 11.0

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95348

--- Comment #34 from Qing Zhao ---

>
>> Though still bigger than what ICC generated.
>
> Yep, but we should be only 2x bigger right now?

Yes, around 2-3 times bigger; much better now.
>
> Can you please test the parallel merging script? I can merge 10GB gcov files
> (5000 runs with 264 files each) in about 50s.

I will make the request soon (I don't have the permission to do this). It might take some time for others to do it.
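(For readers of the archive: the parallel merging script itself is not shown in this comment. The usual way to merge thousands of profile runs quickly is a tree reduction: merge pairs of runs concurrently, then merge the results, so wall time grows roughly with log2(N) rounds rather than N sequential merges. A minimal, hypothetical sketch of that idea in Python follows; the real script merges `.gcda` directories by invoking `gcov-tool merge`, while here the "profiles" are plain counter dicts and `merge_two` is a stand-in for that invocation.)

```python
from concurrent.futures import ThreadPoolExecutor

def merge_two(a, b):
    # Stand-in for one "gcov-tool merge dir_a dir_b" invocation:
    # counters for the same entity are summed, as gcov does for arc counts.
    out = dict(a)
    for key, count in b.items():
        out[key] = out.get(key, 0) + count
    return out

def parallel_merge(profiles, workers=4):
    # Tree reduction: each round merges adjacent pairs in parallel,
    # roughly halving the number of profiles until one remains.
    # Threads suffice here because the real work would happen in
    # spawned gcov-tool processes, not in Python itself.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while len(profiles) > 1:
            firsts = profiles[0::2][: len(profiles) // 2]
            seconds = profiles[1::2]
            merged = list(pool.map(merge_two, firsts, seconds))
            if len(profiles) % 2:        # odd profile out: carry to next round
                merged.append(profiles[-1])
            profiles = merged
    return profiles[0]
```

With 5000 runs this does about 13 rounds of pairwise merges instead of 4999 sequential ones, which is consistent with the ~50s figure quoted above for 10GB of data.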