<div dir="ltr"><div><div><div>Hello,<br>
<br>
I am having problems with L2ARC SSD drives since upgrading to omnios-10b9c79.<br>
After the cache fills up, the reported L2ARC size changes to 16.0E, and zpool iostat -v
shows the cache still growing (at this moment it reports 2.60T used out
of 16.0E available space).<br>
After removing the L2ARC drive from the pool and reattaching it, the problem disappears until the cache fills up again.<br>
The problem affects two machines with a similar setup (they are my only two machines running omnios-10b9c79).<br>
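For reference, the workaround above amounts to the following (a sketch; pool and device names are taken from the output further down and may differ on your system):<br>
<br>
# Workaround sketch: detach and re-add the L2ARC cache device<br>
zpool remove DATA3-BACKUP c1t55CD2E404B575172d0<br>
zpool add DATA3-BACKUP cache c1t55CD2E404B575172d0<br>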
<br>
For the L2ARC I am using an Intel SSD DC S3500 480GB SATA 2.5".<br>
A few system details:<br>
<br>
# uname -a<br>
SunOS nb3 5.11 omnios-10b9c79 i86pc i386 i86pc<br>
<br>
# zpool status | grep cache -A 1<br>
       cache<br>
         c1t55CD2E404B575172d0   ONLINE      0    0    0<br>
<br>
# zpool iostat -v | grep cache -A 1<br>
cache                         -     -     -     -     -     -<br>
 c1t55CD2E404B575172d0   2.60T 16.0E   735   200 1.37M 2.73M<br>
<br>
# kstat zfs::arcstats:*l2*<br>
module: zfs                            instance: 0    <br>
name:  arcstats                       class:   misc<br>
       evict_l2_cached                29975692498432<br>
       evict_l2_eligible              10552753603072<br>
       evict_l2_ineligible            1821252098048<br>
       l2_abort_lowmem                1220<br>
       l2_asize                       2856707439616<br>
       l2_cksum_bad                   584529772<br>
       l2_compress_failures           0<br>
       l2_compress_successes          102235818<br>
       l2_compress_zeros              0<br>
       l2_evict_lock_retry            409<br>
       l2_evict_reading               0<br>
       l2_feeds                       1300435<br>
       l2_free_on_write               11906910<br>
       l2_hdr_size                    34884171088<br>
       l2_hits                        765898065<br>
       l2_io_error                    103979415<br>
       l2_misses                      1910226514<br>
       l2_read_bytes                  1495882560512<br>
       l2_rw_clash                    535147<br>
       l2_size                        3757422951424<br>
       l2_write_bytes                 2984588264448<br>
       l2_writes_done                 916469<br>
       l2_writes_error                0<br>
       l2_writes_hdr_miss             65671<br>
       l2_writes_sent                 916469<br>
<br>
# fmadm faulty<br>
# fmdump -eV<br>
TIME                           CLASS<br>
fmdump: warning: /var/fm/fmd/errlog is empty<br><br># zpool get version DATA3-BACKUP<br>NAME         PROPERTY VALUE   SOURCE<br>DATA3-BACKUP version  28      local<br><br><br></div>I found a similar issue in FreeNAS: <a href="https://bugs.freenas.org/issues/6239">https://bugs.freenas.org/issues/6239</a><br></div>with this patch as the solution: <a href="https://bugs.freenas.org/projects/freenas/repository/trueos/revisions/6ec48ebf5a1596ec7d2732e891fce3f116105ae5/diff/sys/cddl/contrib/opensolaris/uts/common/fs/zfs/arc.c">https://bugs.freenas.org/projects/freenas/repository/trueos/revisions/6ec48ebf5a1596ec7d2732e891fce3f116105ae5/diff/sys/cddl/contrib/opensolaris/uts/common/fs/zfs/arc.c</a><br><br><br></div>I would like to know whether this will be fixed in a stable release soon, or whether there is a patch I can apply to my current environment in the meantime.<br><div><div><div><br>
<div><div class="gmail_signature">Best regards,<br>Benedykt Przybyło<br></div></div>
</div></div></div></div>